Fakra, Eric; Jouve, Elisabeth; Guillaume, Fabrice; Azorin, Jean-Michel; Blin, Olivier
2015-03-01
Deficit in facial affect recognition is a well-documented impairment in schizophrenia, closely connected to social outcome. This deficit could be related to psychopathology, but also to a broader dysfunction in processing facial information. In addition, patients with schizophrenia make inadequate use of configural information, a type of processing that relies on spatial relationships between facial features. To date, no study has specifically examined the link between symptoms and misuse of configural information in the deficit in facial affect recognition. Unmedicated schizophrenia patients (n = 30) and matched healthy controls (n = 30) performed a facial affect recognition task and a face inversion task, which tests the ability to rely on configural information. In patients, regressions were carried out between facial affect recognition, symptom dimensions, and inversion effect. Patients, compared with controls, showed a deficit in facial affect recognition and a lower inversion effect. Negative symptoms and lower inversion effect accounted for 41.2% of the variance in facial affect recognition. This study confirms the presence of a deficit in facial affect recognition, as well as dysfunctional use of configural information, in antipsychotic-free patients. Negative symptoms and poor processing of configural information explained a substantial part of the deficient recognition of facial affect. We speculate that this deficit may be caused by several factors, among which psychopathology and a failure to correctly use configural information stand independently.
Facial affect processing and depression susceptibility: cognitive biases and cognitive neuroscience.
Bistricky, Steven L; Ingram, Rick E; Atchley, Ruth Ann
2011-11-01
Facial affect processing is essential to social development and functioning and is particularly relevant to models of depression. Although cognitive and interpersonal theories have long described different pathways to depression, cognitive-interpersonal and evolutionary social risk models of depression focus on the interrelation of interpersonal experience, cognition, and social behavior. We therefore review the burgeoning depressive facial affect processing literature and examine its potential for integrating disciplines, theories, and research. In particular, we evaluate studies in which information processing or cognitive neuroscience paradigms were used to assess facial affect processing in depressed and depression-susceptible populations. Most studies have assessed and supported cognitive models. This research suggests that depressed and depression-vulnerable groups show abnormal facial affect interpretation, attention, and memory, although findings vary based on depression severity, comorbid anxiety, or length of time faces are viewed. Facial affect processing biases appear to correspond with distinct neural activity patterns and increased depressive emotion and thought. Biases typically emerge in depressed moods but are occasionally found in the absence of such moods. Indirect evidence suggests that childhood neglect might cultivate abnormal facial affect processing, which can impede social functioning in ways consistent with cognitive-interpersonal and interpersonal models. However, reviewed studies provide mixed support for the social risk model prediction that depressive states prompt cognitive hypervigilance to social threat information. We recommend prospective interdisciplinary research examining whether facial affect processing abnormalities promote, or are promoted by, depressogenic attachment experiences, negative thinking, and social dysfunction.
Exploring the nature of facial affect processing deficits in schizophrenia.
van 't Wout, Mascha; Aleman, André; Kessels, Roy P C; Cahn, Wiepke; de Haan, Edward H F; Kahn, René S
2007-04-15
Schizophrenia has been associated with deficits in facial affect processing, especially for negative emotions. However, the exact nature of the deficit remains unclear. The aim of the present study was to investigate whether schizophrenia patients have problems in automatic allocation of attention as well as in controlled evaluation of facial affect. Thirty-seven patients with schizophrenia were compared with 41 control subjects on incidental facial affect processing (gender decision of faces with a fearful, angry, happy, disgusted, and neutral expression) and degraded facial affect labeling (labeling of fearful, angry, happy, and neutral faces). The groups were matched on estimates of verbal and performance intelligence (National Adult Reading Test; Raven's Matrices), general face recognition ability (Benton Face Recognition), and other demographic variables. The results showed that patients with schizophrenia as well as control subjects demonstrate the normal threat-related interference during incidental facial affect processing. Conversely, on controlled evaluation, patients were specifically worse at labeling fearful faces. In particular, patients with high levels of negative symptoms may be characterized by deficits in labeling fear. We suggest that patients with schizophrenia show no evidence of deficits in the automatic allocation of attention resources to fearful (threat-indicating) faces, but have a deficit in the controlled processing of facial emotions that may be specific for fearful faces.
ERIC Educational Resources Information Center
Krebs, Julia F.; Biswas, Ajanta; Pascalis, Olivier; Kamp-Becker, Inge; Remschmidt, Helmuth; Schwarzer, Gudrun
2011-01-01
The current study investigated if deficits in processing emotional expression affect facial identity processing and vice versa in children with autism spectrum disorder. Children with autism and IQ- and age-matched typically developing children classified faces either by emotional expression, thereby ignoring facial identity, or by facial identity…
Rigon, A; Voss, M W; Turkstra, L S; Mutlu, B; Duff, M C
2017-01-01
Although several studies have demonstrated that facial-affect recognition impairment is common following moderate-severe traumatic brain injury (TBI), and that there are diffuse alterations in large-scale functional brain networks in TBI populations, little is known about the relationship between the two. Here, in a sample of 26 participants with TBI and 20 healthy comparison participants (HC) we measured facial-affect recognition abilities and resting-state functional connectivity (rs-FC) using fMRI. We then used network-based statistics to examine (A) the presence of rs-FC differences between individuals with TBI and HC within the facial-affect processing network, and (B) the association between inter-individual differences in emotion recognition skills and rs-FC within the facial-affect processing network. We found that participants with TBI showed significantly lower rs-FC in a component comprising homotopic and within-hemisphere, anterior-posterior connections within the facial-affect processing network. In addition, within the TBI group, participants with higher emotion-labeling skills showed stronger rs-FC within a network comprised of intra- and inter-hemispheric bilateral connections. Findings indicate that the ability to successfully recognize facial-affect after TBI is related to rs-FC within components of facial-affective networks, and provide new evidence that further our understanding of the mechanisms underlying emotion recognition impairment in TBI.
Neumann, Dawn; McDonald, Brenna C; West, John; Keiski, Michelle A; Wang, Yang
2016-06-01
The neurobiological mechanisms that underlie facial affect recognition deficits after traumatic brain injury (TBI) have not yet been identified. Using functional magnetic resonance imaging (fMRI), study aims were to 1) determine if there are differences in brain activation during facial affect processing in people with TBI who have facial affect recognition impairments (TBI-I) relative to people with TBI and healthy controls who do not have facial affect recognition impairments (TBI-N and HC, respectively); and 2) identify relationships between neural activity and facial affect recognition performance. A facial affect recognition screening task performed outside the scanner was used to determine group classification; TBI patients who performed greater than one standard deviation below normal performance scores were classified as TBI-I, while TBI patients with normal scores were classified as TBI-N. An fMRI facial recognition paradigm was then performed within the 3T environment. Results from 35 participants are reported (TBI-I = 11, TBI-N = 12, and HC = 12). For the fMRI task, TBI-I and TBI-N groups scored significantly lower than the HC group. Blood oxygenation level-dependent (BOLD) signals for facial affect recognition compared to a baseline condition of viewing a scrambled face revealed lower neural activation in the right fusiform gyrus (FG) in the TBI-I group than the HC group. Right fusiform gyrus activity correlated with accuracy on the facial affect recognition tasks (both within and outside the scanner). Decreased FG activity suggests facial affect recognition deficits after TBI may be the result of impaired holistic face processing. Future directions and clinical implications are discussed.
Wynn, Jonathan K.; Lee, Junghee; Horan, William P.; Green, Michael F.
2008-01-01
Schizophrenia patients show impairments in identifying facial affect; however, it is not known at what stage facial affect processing is impaired. We evaluated 3 event-related potentials (ERPs) to explore stages of facial affect processing in schizophrenia patients. Twenty-six schizophrenia patients and 27 normal controls participated. In separate blocks, subjects identified the gender of a face, the emotion of a face, or if a building had 1 or 2 stories. Three ERPs were examined: (1) P100 to examine basic visual processing, (2) N170 to examine facial feature encoding, and (3) N250 to examine affect decoding. Behavioral performance on each task was also measured. Results showed that schizophrenia patients’ P100 was comparable to the controls during all 3 identification tasks. Both patients and controls exhibited a comparable N170 that was largest during processing of faces and smallest during processing of buildings. For both groups, the N250 was largest during the emotion identification task and smallest for the building identification task. However, the patients produced a smaller N250 compared with the controls across the 3 tasks. The groups did not differ in behavioral performance in any of the 3 identification tasks. The pattern of intact P100 and N170 suggest that patients maintain basic visual processing and facial feature encoding abilities. The abnormal N250 suggests that schizophrenia patients are less efficient at decoding facial affect features. Our results imply that abnormalities in the later stage of feature decoding could potentially underlie emotion identification deficits in schizophrenia.
Donges, Uta-Susan; Dukalski, Bibiana; Kersting, Anette; Suslow, Thomas
2015-01-01
Instability of affects and interpersonal relations are important features of borderline personality disorder (BPD). Interpersonal problems of individuals suffering from BPD might develop based on abnormalities in the processing of facial affects and high sensitivity to negative affective expressions. The aims of the present study were to examine automatic evaluative shifts and latencies as a function of masked facial affects in patients with BPD compared to healthy individuals. As BPD comorbidity rates for mental and personality disorders are high, we also investigated the relationships of affective processing characteristics with specific borderline symptoms and comorbidity. Twenty-nine women with BPD and 38 healthy women participated in the study. The majority of patients suffered from additional Axis I disorders and/or additional personality disorders. In the priming experiment, an angry, happy, neutral, or no facial expression was briefly presented (for 33 ms) and masked by neutral faces that had to be evaluated. Evaluative decisions and response latencies were registered. Borderline-typical symptomatology was assessed with the Borderline Symptom List. In the total sample, valence-congruent evaluative shifts and delays of evaluative decision due to facial affect were observed. No between-group differences were obtained for evaluative decisions and latencies. The presence of comorbid anxiety disorders was found to be positively correlated with evaluative shifting owing to masked happy primes, regardless of whether the baseline was a neutral-face or no-expression condition. The presence of comorbid depressive disorder, paranoid personality disorder, and symptoms of social isolation and self-aggression were significantly correlated with response delay due to masked angry faces, regardless of baseline. In the present affective priming study, no abnormalities in the automatic recognition and processing of facial affects were observed in BPD patients compared to healthy individuals.
The presence of comorbid anxiety disorders could make patients more susceptible to the influence of a happy expression on judgment processes at an automatic processing level. Comorbid depressive disorder, paranoid personality disorder, and symptoms of social isolation and self-aggression may enhance automatic attention allocation to threatening facial expressions in BPD. Increased automatic vigilance for social threat stimuli might contribute to affective instability and interpersonal problems in specific patients with BPD.
Identifying differences in biased affective information processing in major depression.
Gollan, Jackie K; Pane, Heather T; McCloskey, Michael S; Coccaro, Emil F
2008-05-30
This study investigates the extent to which participants with major depression differ from healthy comparison participants in irregularities in affective information processing, characterized by deficits in facial expression recognition, intensity categorization, and reaction time to identifying emotionally salient and neutral information. Data on diagnoses, symptom severity, and affective information processing using a facial recognition task were collected from 66 participants, male and female, between ages 18 and 54 years, grouped by major depressive disorder (N=37) or healthy non-psychiatric (N=29) status. Findings from MANCOVAs revealed that major depression was associated with a significantly longer reaction time to sad facial expressions compared with healthy status. Also, depressed participants demonstrated a negative bias, interpreting neutral facial expressions as sad significantly more often than healthy participants did. In turn, healthy participants interpreted neutral faces as happy significantly more often than depressed participants. No group differences were observed for facial expression recognition and intensity categorization. The observed effects suggest that depression is associated with altered perception of the intensity of negative affective stimuli, delayed processing of sad affective information, and a bias toward interpreting neutral faces as sad.
Spapé, M M; Harjunen, Ville; Ravaja, N
2017-03-01
Being touched is known to affect emotion, and even a casual touch can elicit positive feelings and affinity. Psychophysiological studies have recently shown that tactile primes affect visual evoked potentials to emotional stimuli, suggesting altered affective stimulus processing. As, however, these studies approached emotion from a purely unidimensional perspective, it remains unclear whether touch biases emotional evaluation or a more general feature such as salience. Here, we investigated how simple tactile primes modulate event related potentials (ERPs), facial EMG and cardiac response to pictures of facial expressions of emotion. All measures replicated known effects of emotional face processing: Disgust and fear modulated early ERPs, anger increased the cardiac orienting response, and expressions elicited emotion-congruent facial EMG activity. Tactile primes also affected these measures, but priming never interacted with the type of emotional expression. Thus, touch may additively affect general stimulus processing, but it does not bias or modulate immediate affective evaluation.
Norton, Daniel; McBain, Ryan; Holt, Daphne J; Ongur, Dost; Chen, Yue
2009-06-15
Impaired emotion recognition has been reported in schizophrenia, yet the nature of this impairment is not completely understood. Recognition of facial emotion depends on processing affective and nonaffective facial signals, as well as basic visual attributes. We examined whether and how poor facial emotion recognition in schizophrenia is related to basic visual processing and nonaffective face recognition. Schizophrenia patients (n = 32) and healthy control subjects (n = 29) performed emotion discrimination, identity discrimination, and visual contrast detection tasks, where the emotionality, distinctiveness of identity, or visual contrast was systematically manipulated. Subjects determined which of two presentations in a trial contained the target: the emotional face for emotion discrimination, a specific individual for identity discrimination, and a sinusoidal grating for contrast detection. Patients had significantly higher thresholds (worse performance) than control subjects for discriminating both fearful and happy faces. Furthermore, patients' poor performance in fear discrimination was predicted by performance in visual detection and face identity discrimination. Schizophrenia patients require greater emotional signal strength to discriminate fearful or happy face images from neutral ones. Deficient emotion recognition in schizophrenia does not appear to be determined solely by affective processing but is also linked to the processing of basic visual and facial information.
Stepanova, Elena V; Strube, Michael J
2012-01-01
Participants (N = 106) performed an affective priming task with facial primes that varied in skin tone and facial physiognomy and were presented either in color or in grayscale. Participants' racial evaluations were more positive for Eurocentric than for Afrocentric physiognomy faces. Light skin tone faces were evaluated more positively than dark skin tone faces, but the magnitude of this effect depended on the mode of color presentation. The results suggest that in affective priming tasks, faces might not be processed holistically; instead, visual features of facial priming stimuli independently affect implicit evaluations.
Carroll, Erin M A; Kamboj, Sunjeev K; Conroy, Laura; Tookman, Adrian; Williams, Amanda C de C; Jones, Louise; Morgan, Celia J A; Curran, H Valerie
2011-06-01
As a multidimensional phenomenon, pain is influenced by various psychological factors. One such factor is catastrophizing, which is associated with higher pain intensity and emotional distress in cancer and noncancer pain. One possibility is that catastrophizing represents a general cognitive style that preferentially supports the processing of negative affective stimuli. Such preferential processing of threat (toward negative facial expressions, for example) is seen in emotional disorders and is sensitive to pharmacological treatment. Whether pharmacological (analgesic) treatment might also influence the processing of threat in pain patients is currently unclear. This study investigates the effects of catastrophizing on the processing of facial affect in those receiving an acute opioid dose. In a double-blind crossover design, the performance of 20 palliative care patients after their usual dose of immediate-release opioid was compared with their performance following matched-placebo administration on a facial affect recognition task (i.e., speed and accuracy) and a threat-pain estimation task (i.e., ratings of pain intensity). The influence of catastrophizing was examined by splitting the sample according to their scores on the Pain Catastrophizing Scale (PCS). Opioid administration had no effect on facial affect processing compared with placebo. However, the main finding was that enhanced processing of fear, sadness, and disgust was found only in patients who scored highly on the PCS. There was no difference in performance between the two PCS groups on the other emotions (i.e., happiness, surprise, and anger). These findings suggest that catastrophizing is associated with an affective information-processing bias in patients with severe pain conditions.
Holistic processing of static and moving faces.
Zhao, Mintao; Bülthoff, Isabelle
2017-07-01
Humans' face processing ability develops and matures with extensive experience in perceiving, recognizing, and interacting with faces, which move most of the time. However, how facial movements affect one core aspect of face processing ability, holistic face processing, remains unclear. Here we investigated the influence of rigid facial motion on holistic and part-based face processing by manipulating the presence of facial motion during study and at test in a composite face task. The results showed that rigidly moving faces were processed as holistically as static faces (Experiment 1). Holistic processing of moving faces persisted whether facial motion was presented during study, at test, or both (Experiment 2). Moreover, when faces were inverted to eliminate the contributions of both an upright face template and observers' expertise with upright faces, rigid facial motion facilitated holistic face processing (Experiment 3). Thus, holistic processing represents a general principle of face perception that applies to both static and dynamic faces, rather than being limited to static faces. These results support an emerging view that both perceiver-based and face-based factors contribute to holistic face processing, and they offer new insights on what underlies holistic face processing, how the sources of information supporting holistic face processing interact, and why facial motion may affect face recognition and holistic face processing differently.
Tsukiura, Takashi
2012-01-01
In our daily lives, we form some impressions of other people. Although those impressions are affected by many factors, face-based affective signals such as facial expression, facial attractiveness, or trustworthiness are important. Previous psychological studies have demonstrated the impact of facial impressions on remembering other people, but little is known about the neural mechanisms underlying this psychological process. The purpose of this article is to review recent functional MRI (fMRI) studies to investigate the effects of face-based affective signals including facial expression, facial attractiveness, and trustworthiness on memory for faces, and to propose a tentative concept for understanding this affective-cognitive interaction. On the basis of the aforementioned research, three brain regions are potentially involved in the processing of face-based affective signals. The first candidate is the amygdala, where activity is generally modulated by both affectively positive and negative signals from faces. Activity in the orbitofrontal cortex (OFC), as the second candidate, increases as a function of perceived positive signals from faces; whereas activity in the insular cortex, as the third candidate, reflects a function of face-based negative signals. In addition, neuroscientific studies have reported that the three regions are functionally connected to the memory-related hippocampal regions. These findings suggest that the effects of face-based affective signals on memory for faces could be modulated by interactions between the regions associated with the processing of face-based affective signals and the hippocampus as a memory-related region.
Event-related theta synchronization predicts deficit in facial affect recognition in schizophrenia.
Csukly, Gábor; Stefanics, Gábor; Komlósi, Sarolta; Czigler, István; Czobor, Pál
2014-02-01
Growing evidence suggests that abnormalities in the synchronized oscillatory activity of neurons in schizophrenia may lead to impaired neural activation and temporal coding, and thus to neurocognitive dysfunctions such as deficits in facial affect recognition. To gain insight into the neurobiological processes linked to facial affect recognition, we investigated both induced and evoked oscillatory activity by calculating the Event Related Spectral Perturbation (ERSP) and the Inter Trial Coherence (ITC) during facial affect recognition. Fearful and neutral faces as well as nonface patches were presented to 24 patients with schizophrenia and 24 matched healthy controls while EEG was recorded. The participants' task was to recognize facial expressions. Because previous findings with healthy controls showed that facial feature decoding was associated primarily with oscillatory activity in the theta band, we analyzed ERSP and ITC in this frequency band in the time interval of 140-200 ms, which corresponds to the N170 component. Event-related theta activity and phase-locking to facial expressions, but not to nonface patches, predicted emotion recognition performance in both controls and patients. Event-related changes in theta amplitude and phase-locking were significantly weaker in patients than in healthy controls, in line with previous investigations showing decreased neural synchronization in the low frequency bands in patients with schizophrenia. Neural synchrony is thought to underlie distributed information processing. Our results indicate less effective functioning of the facial feature recognition process, which may contribute to less effective social cognition in schizophrenia.
Effective connectivity during processing of facial affect: evidence for multiple parallel pathways.
Dima, Danai; Stephan, Klaas E; Roiser, Jonathan P; Friston, Karl J; Frangou, Sophia
2011-10-05
The perception of facial affect engages a distributed cortical network. We used functional magnetic resonance imaging and dynamic causal modeling to characterize effective connectivity during explicit (conscious) categorization of affective stimuli in the human brain. Specifically, we examined the modulation of connectivity from posterior regions of the face-processing network to the lateral ventral prefrontal cortex (VPFC) during affective categorization and we tested for a potential role of the amygdala (AMG) in mediating this modulation. We found that explicit processing of facial affect led to prominent modulation (increase) in the effective connectivity from the inferior occipital gyrus (IOG) to the VPFC, while there was less evidence for modulation of the afferent connections from fusiform gyrus and AMG to VPFC. More specifically, the forward connection from IOG to the VPFC exhibited a selective increase under anger (as opposed to fear or sadness). Furthermore, Bayesian model comparison suggested that the modulation of afferent connections to the VPFC was mediated directly by facial affect, as opposed to an indirect modulation mediated by the AMG. Our results thus suggest that affective information is conveyed to the VPFC along multiple parallel pathways and that AMG activity is not sufficient to account for the gating of information transfer to the VPFC during explicit emotional processing.
Is right hemisphere decline in the perception of emotion a function of aging?
McDowell, C L; Harrison, D W; Demaree, H A
1994-11-01
The hypothesis that the right cerebral hemisphere declines more quickly than the left cerebral hemisphere in the normal aging process was tested using accuracy and intensity measures in a facial recognition test and using response time and response bias measures in a tachistoscopic paradigm. Elderly and younger men and women (N = 60) participated in both experiments. Experiment 1 required facial affect identification and intensity ratings of 50 standardized photographs of 5 affective categories: Happy, Neutral, Sad, Angry, and Fearful. The elderly were significantly less accurate in identifying facial affective valence. This effect was found using negative and neutral expressions. Results for happy expressions, however, were consistent with the younger group. In Experiment 2, age differences in hemispheric asymmetry were evaluated using presentation of affective faces in each visual field. Following prolonged experience with the affective stimuli during Experiment 1, the elderly showed heightened cerebral asymmetry for facial affect processing compared to the younger group. Both groups showed a positive affective bias to neutral stimuli presented to the left hemisphere. Elderly and younger subjects scored significantly higher on Vocabulary and Block Design subtests of the WAIS-R, respectively. Overall, the findings suggest that the elderly have more difficulty processing negative affect, while their ability to process positive affect remains intact. The results lend only partial support to the right hemi-aging hypothesis.
Modulation of α power and functional connectivity during facial affect recognition.
Popov, Tzvetan; Miller, Gregory A; Rockstroh, Brigitte; Weisz, Nathan
2013-04-03
Research has linked oscillatory activity in the α frequency range, particularly in sensorimotor cortex, to processing of social actions. Results further suggest involvement of sensorimotor α in the processing of facial expressions, including affect. The sensorimotor face area may be critical for perception of emotional face expression, but the role it plays is unclear. The present study sought to clarify how oscillatory brain activity contributes to or reflects processing of facial affect during changes in facial expression. Neuromagnetic oscillatory brain activity was monitored while 30 volunteers viewed videos of human faces that changed their expression from neutral to fearful, neutral, or happy expressions. Induced changes in α power during the different morphs, source analysis, and graph-theoretic metrics served to identify the role of α power modulation and cross-regional coupling by means of phase synchrony during facial affect recognition. Changes from neutral to emotional faces were associated with a 10-15 Hz power increase localized in bilateral sensorimotor areas, together with occipital power decrease, preceding reported emotional expression recognition. Graph-theoretic analysis revealed that, in the course of a trial, the balance between sensorimotor power increase and decrease was associated with decreased and increased transregional connectedness as measured by node degree. Results suggest that modulations in α power facilitate early registration, with sensorimotor cortex including the sensorimotor face area largely functionally decoupled and thereby protected from additional, disruptive input and that subsequent α power decrease together with increased connectedness of sensorimotor areas facilitates successful facial affect recognition.
ERIC Educational Resources Information Center
Wang, Shirley S.; Treat, Teresa A.; Brownell, Kelly D.
2008-01-01
This study examines 2 aspects of cognitive processing in person perception--attention and decision making--in classroom-relevant contexts. Teachers completed 2 implicit, performance-based tasks that characterized attention to and utilization of 4 student characteristics of interest: ethnicity, facial affect, body size, and attractiveness. Stimuli…
Anders, Silke; Sack, Benjamin; Pohl, Anna; Münte, Thomas; Pramstaller, Peter; Klein, Christine; Binkofski, Ferdinand
2012-04-01
Patients with Parkinson's disease suffer from significant motor impairments and accompanying cognitive and affective dysfunction due to progressive disturbances of basal ganglia-cortical gating loops. Parkinson's disease has a long presymptomatic stage, which indicates a substantial capacity of the human brain to compensate for dopaminergic nerve degeneration before clinical manifestation of the disease. Neuroimaging studies provide evidence that increased motor-related cortical activity can compensate for progressive dopaminergic nerve degeneration in carriers of a single mutant Parkin or PINK1 gene, who show a mild but significant reduction of dopamine metabolism in the basal ganglia in the complete absence of clinical motor signs. However, it is currently unknown whether similar compensatory mechanisms are effective in non-motor basal ganglia-cortical gating loops. Here, we ask whether asymptomatic Parkin mutation carriers show altered patterns of brain activity during processing of facial gestures, and whether this might compensate for latent facial emotion recognition deficits. Current theories in social neuroscience assume that execution and perception of facial gestures are linked by a special class of visuomotor neurons ('mirror neurons') in the ventrolateral premotor cortex/pars opercularis of the inferior frontal gyrus (Brodmann area 44/6). We hypothesized that asymptomatic Parkin mutation carriers would show increased activity in this area during processing of affective facial gestures, replicating the compensatory motor effects that have previously been observed in these individuals. Additionally, Parkin mutation carriers might show altered activity in other basal ganglia-cortical gating loops. Eight asymptomatic heterozygous Parkin mutation carriers and eight matched controls underwent functional magnetic resonance imaging and a subsequent facial emotion recognition task. 
As predicted, Parkin mutation carriers showed significantly stronger activity in the right ventrolateral premotor cortex during execution and perception of affective facial gestures than healthy controls. Furthermore, Parkin mutation carriers showed a slightly reduced ability to recognize facial emotions that was least severe in individuals who showed the strongest increase of ventrolateral premotor activity. In addition, Parkin mutation carriers showed a significantly weaker than normal increase of activity in the left lateral orbitofrontal cortex (inferior frontal gyrus pars orbitalis, Brodmann area 47), which was unrelated to facial emotion recognition ability. These findings are consistent with the hypothesis that compensatory activity in the ventrolateral premotor cortex during processing of affective facial gestures can reduce impairments in facial emotion recognition in subclinical Parkin mutation carriers. A breakdown of this compensatory mechanism might lead to the impairment of facial expressivity and facial emotion recognition observed in manifest Parkinson's disease.
Affect of the unconscious: visually suppressed angry faces modulate our decisions.
Almeida, Jorge; Pajtas, Petra E; Mahon, Bradford Z; Nakayama, Ken; Caramazza, Alfonso
2013-03-01
Emotional and affective processing imposes itself over cognitive processes and modulates our perception of the surrounding environment. In two experiments, we addressed the issue of whether nonconscious processing of affect can take place even under deep states of unawareness, such as those induced by interocular suppression techniques, and can elicit an affective response that can influence our understanding of the surrounding environment. In Experiment 1, participants judged the likeability of an unfamiliar item--a Chinese character--that was preceded by a face expressing a particular emotion (either happy or angry). The face was rendered invisible through an interocular suppression technique (continuous flash suppression; CFS). In Experiment 2, backward masking (BM), a less robust masking technique, was used to render the facial expressions invisible. We found that despite equivalent phenomenological suppression of the visual primes under CFS and BM, different patterns of affective processing were obtained with the two masking techniques. Under BM, nonconscious affective priming was obtained for both happy and angry invisible facial expressions. However, under CFS, nonconscious affective priming was obtained only for angry facial expressions. We discuss an interpretation of this dissociation between affective processing and visual masking techniques in terms of distinct routes from the retina to the amygdala.
Leitman, David I; Wolf, Daniel H; Loughead, James; Valdez, Jeffrey N; Kohler, Christian G; Brensinger, Colleen; Elliott, Mark A; Turetsky, Bruce I; Gur, Raquel E; Gur, Ruben C
2011-01-01
Schizophrenia patients display impaired performance and brain activity during facial affect recognition. These impairments may reflect stimulus-driven perceptual decrements and evaluative processing abnormalities. We differentiated these two processes by contrasting responses to identical stimuli presented under different contexts. Seventeen healthy controls and 16 schizophrenia patients performed an fMRI facial affect detection task. Subjects identified an affective target presented amongst foils of differing emotions. We hypothesized that targeting affiliative emotions (happiness, sadness) would create a task demand context distinct from that generated when targeting threat emotions (anger, fear). We compared affiliative foil stimuli within a congruent affiliative context with identical stimuli presented in an incongruent threat context. Threat foils were analysed in the same manner. Controls activated right orbitofrontal cortex (OFC)/ventrolateral prefrontal cortex (VLPFC) more to affiliative foils in threat contexts than to identical stimuli within affiliative contexts. Patients displayed reduced OFC/VLPFC activation to all foils, and no activation modulation by context. This lack of context modulation coincided with a 2-fold decrement in foil detection efficiency. Task demands produce contextual effects during facial affective processing in regions activated during affect evaluation. In schizophrenia, reduced modulation of OFC/VLPFC by context coupled with reduced behavioural efficiency suggests impaired ventral prefrontal control mechanisms that optimize affective appraisal.
Infrared thermal facial image sequence registration analysis and verification
NASA Astrophysics Data System (ADS)
Chen, Chieh-Li; Jian, Bo-Lin
2015-03-01
To study the emotional responses of subjects to the International Affective Picture System (IAPS), the infrared thermal facial image sequence is preprocessed for registration before further analysis, such that the variance caused by minor and irregular subject movements is reduced. Without affecting the comfort level and inducing minimal harm, this study proposes an infrared thermal facial image sequence registration process that reduces the deviations caused by the unconscious head shaking of the subjects. A fixed image for registration is produced through the localization of the centroid of the eye region as well as image translation and rotation processes. The thermal image sequence is then automatically registered using the proposed two-stage genetic algorithm. The deviation before and after image registration is demonstrated by image quality indices. The results show that the infrared thermal image sequence registration process proposed in this study is effective in localizing facial images accurately, which will be beneficial to the correlation analysis of psychological information related to the facial area.
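The first stage of the pipeline described in this abstract — localizing the eye-region centroid and translating each frame onto a fixed reference — can be sketched as below. The eye-mask extraction, the rotation step, and the two-stage genetic algorithm refinement are omitted, and all function names are illustrative rather than the authors'.

```python
import numpy as np
from scipy.ndimage import shift

def region_centroid(mask):
    """Centroid (row, col) of a binary region mask, e.g. the eye region."""
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

def register_to_reference(frame, eye_mask, ref_centroid):
    """Translate a frame so its eye-region centroid lands on the reference centroid."""
    r, c = region_centroid(eye_mask)
    dr, dc = ref_centroid[0] - r, ref_centroid[1] - c
    # Bilinear interpolation, zero padding outside the original frame.
    return shift(frame, (dr, dc), order=1, mode='constant')

# Toy frame: a bright "eye region" blob offset from the reference location.
frame = np.zeros((20, 20))
frame[4:6, 4:6] = 1.0                 # blob centred at (4.5, 4.5)
eye_mask = frame > 0.5
aligned = register_to_reference(frame, eye_mask, ref_centroid=(10.5, 10.5))
```

In the full method, such a coarse alignment would be refined (e.g. by the proposed two-stage genetic algorithm searching over translation and rotation parameters), and image quality indices computed before and after registration would quantify the residual deviation.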
Jusyte, Aiste; Schönenberg, Michael
2014-06-30
Facial affect is one of the most important information sources during the course of social interactions, but it is susceptible to distortion due to its complex and dynamic nature. Socially anxious individuals have been shown to exhibit alterations in the processing of social information, such as an attentional and interpretative bias toward threatening information. This may be one of the key factors contributing to the development and maintenance of anxious psychopathology. The aim of the current study was to investigate whether a threat-related interpretation bias is evident for ambiguous facial stimuli in a population of individuals with a generalized Social Anxiety Disorder (gSAD) as compared to healthy controls. Participants judged ambiguous happy/fearful, angry/fearful and angry/happy blends varying in intensity and rated the predominant affective expression. The results obtained in this study do not indicate that gSAD is associated with a biased interpretation of ambiguous facial affect. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
The association between PTSD and facial affect recognition.
Williams, Christian L; Milanak, Melissa E; Judah, Matt R; Berenbaum, Howard
2018-05-05
The major aims of this study were to examine how, if at all, having higher levels of PTSD would be associated with performance on a facial affect recognition task in which facial expressions of emotion are superimposed on emotionally valenced, non-face images. College students with trauma histories (N = 90) completed a facial affect recognition task as well as measures of exposure to traumatic events, and PTSD symptoms. When the face and context matched, participants with higher levels of PTSD were significantly more accurate. When the face and context were mismatched, participants with lower levels of PTSD were more accurate than were those with higher levels of PTSD. These findings suggest that PTSD is associated with how people process affective information. Furthermore, these results suggest that the enhanced attention of people with higher levels of PTSD to affective information can be either beneficial or detrimental to their ability to accurately identify facial expressions of emotion. Limitations, future directions and clinical implications are discussed. Copyright © 2018 Elsevier B.V. All rights reserved.
Donges, Uta-Susan; Kersting, Anette; Suslow, Thomas
2012-01-01
There is evidence that women are better in recognizing their own and others' emotions. The female advantage in emotion recognition becomes even more apparent under conditions of rapid stimulus presentation. Affective priming paradigms have been developed to examine empirically whether facial emotion stimuli presented outside of conscious awareness color our impressions. It was observed that masked emotional facial expression has an affect congruent influence on subsequent judgments of neutral stimuli. The aim of the present study was to examine the effect of gender on affective priming based on negative and positive facial expression. In our priming experiment sad, happy, neutral, or no facial expression was briefly presented (for 33 ms) and masked by neutral faces which had to be evaluated. 81 young healthy volunteers (53 women) participated in the study. Subjects had no subjective awareness of emotional primes. Women did not differ from men with regard to age, education, intelligence, trait anxiety, or depressivity. In the whole sample, happy but not sad facial expression elicited valence congruent affective priming. Between-group analyses revealed that women manifested greater affective priming due to happy faces than men. Women seem to have a greater ability to perceive and respond to positive facial emotion at an automatic processing level compared to men. High perceptual sensitivity to minimal social-affective signals may contribute to women's advantage in understanding other persons' emotional states.
Effects of facial color on the subliminal processing of fearful faces.
Nakajima, K; Minami, T; Nakauchi, S
2015-12-03
Recent studies have suggested that both configural information, such as face shape, and surface information are important for face perception. In particular, facial color is sufficiently suggestive of emotional states, as in the phrases "flushed with anger" and "pale with fear." However, few studies have examined the relationship between facial color and emotional expression. On the other hand, event-related potential (ERP) studies have shown that emotional expressions, such as fear, are processed unconsciously. In this study, we examined how facial color modulated the supraliminal and subliminal processing of fearful faces. We recorded electroencephalograms while participants performed a facial emotion identification task involving masked target faces exhibiting facial expressions (fearful or neutral) and colors (natural or bluish). The results indicated that there was a significant interaction between facial expression and color for the latency of the N170 component. Subsequent analyses revealed that the bluish-colored faces increased the latency effect of facial expressions compared to the natural-colored faces, indicating that the bluish color modulated the processing of fearful expressions. We conclude that the unconscious processing of fearful faces is affected by facial color. Copyright © 2015 IBRO. Published by Elsevier Ltd. All rights reserved.
Liu, Chengwei; Liu, Ying; Iqbal, Zahida; Li, Wenhui; Lv, Bo; Jiang, Zhongqing
2017-01-01
To investigate the interaction between facial expressions and facial gender information during face perception, the present study matched the intensities of the two types of information in face images and presented the images under the orthogonal condition of the Garner Paradigm; participants judged the gender and expression of the faces, with gender and expression varied orthogonally. Gender and expression processing displayed a mutual interaction. On the one hand, the judgment of angry expressions occurred faster when presented with male facial images; on the other hand, the classification of the female gender occurred faster when presented with a happy facial expression than when presented with an angry facial expression. According to the event-related potential results, the expression classification was influenced by gender during the face structural processing stage (as indexed by N170), which indicates the promotion or interference of facial gender with the coding of facial expression features. However, gender processing was affected by facial expressions in more stages, including the early (P1) and late (LPC) stages of perceptual processing, reflecting that emotional expression influences gender processing mainly by directing attention.
Balconi, Michela; Canavesio, Ylenia
2016-01-01
The present research explored the effect of social empathy on processing emotional facial expressions. Previous evidence suggested a close relationship between emotional empathy and both the ability to detect facial emotions and the attentional mechanisms involved. A multi-measure approach was adopted: we investigated the association between trait empathy (Balanced Emotional Empathy Scale) and individuals' performance (response times; RTs), attentional mechanisms (eye movements; number and duration of fixations), correlates of cortical activation (event-related potential (ERP) N200 component), and facial responsiveness (facial zygomatic and corrugator activity). Trait empathy was found to affect face detection performance (reduced RTs), attentional processes (more scanning eye movements in specific areas of interest), the ERP salience effect (increased N200 amplitude), and electromyographic activity (more facial responses). A second important result was the demonstration of strong, direct correlations among these measures. We suggest that empathy may function as a social facilitator of the processes underlying the detection of facial emotion, and a general "facial response effect" is proposed to explain these results. We assume that empathy influences both cognitive processing and facial responsiveness, such that empathic individuals are more skilful in processing facial emotion.
Developmental trends in the process of constructing own- and other-race facial composites.
Kehn, Andre; Renken, Maggie D; Gray, Jennifer M; Nunez, Narina L
2014-01-01
The current study examined developmental differences from the age of 5 to 18 in the creation process of own- and other-race facial composites. In addition, it considered how differences in the creation process affect similarity ratings. Participants created two composites (one own- and one other-race) from memory. The complexity of the composite creation process was recorded during Phase One. In Phase Two, a separate group of participants rated the composites for similarity to the corresponding target face. Results support the cross-race effect, developmental differences (based on composite creators) in similarity ratings, and the importance of the creation process for own- and other-race facial composites. Together, these findings suggest that as children get older the process through which they create facial composites becomes more complex and their ability to create facial composites improves. Increased complexity resulted in higher rated composites. Results are discussed from a psycho-legal perspective.
A systems view of mother-infant face-to-face communication.
Beebe, Beatrice; Messinger, Daniel; Bahrick, Lorraine E; Margolis, Amy; Buck, Karen A; Chen, Henian
2016-04-01
Principles of a dynamic, dyadic systems view of mother-infant face-to-face communication, which considers self- and interactive processes in relation to one another, were tested. The process of interaction across time in a large low-risk community sample at infant age 4 months was examined. Split-screen videotape was coded on a 1-s time base for communication modalities of attention, affect, orientation, touch, and composite facial-visual engagement. Time-series approaches generated self- and interactive contingency estimates in each modality. Evidence supporting the following principles was obtained: (a) Significant moment-to-moment predictability within each partner (self-contingency) and between the partners (interactive contingency) characterizes mother-infant communication. (b) Interactive contingency is organized by a bidirectional, but asymmetrical, process: Maternal contingent coordination with infant is higher than infant contingent coordination with mother. (c) Self-contingency organizes communication to a far greater extent than interactive contingency. (d) Self- and interactive contingency processes are not separate; each affects the other in communication modalities of facial affect, facial-visual engagement, and orientation. Each person's self-organization exists in a dynamic, homoeostatic (negative feedback) balance with the degree to which the person coordinates with the partner. For example, those individuals who are less facially stable are likely to coordinate more strongly with the partner's facial affect and vice versa. Our findings support the concept that the dyad is a fundamental unit of analysis in the investigation of early interaction. Moreover, an individual's self-contingency is influenced by the way the individual coordinates with the partner. Our results imply that it is not appropriate to conceptualize interactive processes without simultaneously accounting for dynamically interrelated self-organizing processes. 
(c) 2016 APA, all rights reserved.
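The self- and interactive contingency estimates described above are, at their core, lagged predictability measures: how well one's own prior behavior, versus the partner's prior behavior, predicts one's current behavior. A minimal conceptual illustration using lag-1 correlation on synthetic second-by-second engagement codes follows; the study used full time-series models, so this sketch (and every name in it) is illustrative only.

```python
import numpy as np

def lagged_r(target, predictor, lag=1):
    """Correlation between the predictor at time t-lag and the target at time t."""
    x, y = predictor[:-lag], target[lag:]
    return np.corrcoef(x, y)[0, 1]

# Toy second-by-second engagement codes for one mother-infant dyad.
rng = np.random.default_rng(1)
infant = np.cumsum(rng.standard_normal(200))                   # autocorrelated series
mother = 0.6 * np.roll(infant, 1) + rng.standard_normal(200)   # tracks the infant

self_contingency = lagged_r(infant, infant)         # infant's past predicts infant's present
interactive_contingency = lagged_r(mother, infant)  # infant's past predicts mother's present
```

Both values are high for this synthetic dyad; in the actual data, comparing estimates of this kind across modalities and partners is what supports findings such as self-contingency organizing communication more strongly than interactive contingency.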
On Assisting a Visual-Facial Affect Recognition System with Keyboard-Stroke Pattern Information
NASA Astrophysics Data System (ADS)
Stathopoulou, I.-O.; Alepis, E.; Tsihrintzis, G. A.; Virvou, M.
Towards realizing a multimodal affect recognition system, we consider the advantages of assisting a visual-facial expression recognition system with keyboard-stroke pattern information. Our work is based on the assumption that the visual-facial and keyboard modalities are complementary to each other and that their combination can significantly improve the accuracy of affective user models. Specifically, we present and discuss the development and evaluation process of two corresponding affect recognition subsystems, with emphasis on the recognition of six basic emotional states, namely happiness, sadness, surprise, anger, and disgust, as well as the emotion-less state, which we refer to as neutral. We find that emotion recognition by the visual-facial modality can be aided greatly by keyboard-stroke pattern information and that the combination of the two modalities can lead to better results towards building a multimodal affect recognition system.
The Facially Disfigured Child.
ERIC Educational Resources Information Center
Moncada, Georgia A.
1987-01-01
The article reviews diagnosis and treatments for facially disfigured children including craniofacial reconstruction and microsurgical techniques. Noted are associated disease processes that affect the social and intellectual outcomes of the afflicted child. (Author/DB)
Right Hemispheric Dominance in Processing of Unconscious Negative Emotion
ERIC Educational Resources Information Center
Sato, Wataru; Aoki, Satoshi
2006-01-01
Right hemispheric dominance in unconscious emotional processing has been suggested, but remains controversial. This issue was investigated using the subliminal affective priming paradigm combined with unilateral visual presentation in 40 normal subjects. In either left or right visual fields, angry facial expressions, happy facial expressions, or…
Gentsch, Kornelia; Grandjean, Didier; Scherer, Klaus R
2015-01-01
Scherer's Component Process Model provides a theoretical framework for research on the production mechanism of emotion and facial emotional expression. The model predicts that appraisal results drive facial expressions, which unfold sequentially and cumulatively over time. In two experiments, we examined facial muscle activity changes (via facial electromyography recordings over the corrugator, cheek, and frontalis regions) in response to events in a gambling task. These events were experimentally manipulated feedback stimuli which presented simultaneous information directly affecting goal conduciveness (gambling outcome: win, loss, or break-even) and power appraisals (Experiment 1 and 2), as well as control appraisal (Experiment 2). We repeatedly found main effects of goal conduciveness (starting ~600 ms), and power appraisals (starting ~800 ms after feedback onset). Control appraisal main effects were inconclusive. Interaction effects of goal conduciveness and power appraisals were obtained in both experiments (Experiment 1: over the corrugator and cheek regions; Experiment 2: over the frontalis region) suggesting amplified goal conduciveness effects when power was high in contrast to invariant goal conduciveness effects when power was low. Also an interaction of goal conduciveness and control appraisals was found over the cheek region, showing differential goal conduciveness effects when control was high and invariant effects when control was low. These interaction effects suggest that the appraisal of having sufficient control or power affects facial responses towards gambling outcomes. The result pattern suggests that corrugator and frontalis regions are primarily related to cognitive operations that process motivational pertinence, whereas the cheek region would be more influenced by coping implications. 
Our results provide the first evidence that cognitive-evaluative mechanisms related to goal conduciveness, control, and power appraisals affect facial expressions dynamically over time, immediately after an event is perceived. In addition, our results provide further indications for the chronography of appraisal-driven facial movements and the underlying cognitive processes.
Schwab, Daniela; Schienle, Anne
2018-06-01
The present event-related potential (ERP) study investigated for the first time whether children with early-onset social anxiety disorder (SAD) process affective facial expressions of varying intensities differently than non-anxious controls. Participants were 15 SAD patients and 15 non-anxious controls (mean age of 9 years). They were presented with schematic faces displaying anger and happiness at four intensity levels (25%, 50%, 75%, and 100%), as well as with neutral faces. ERPs in early and later time windows (P100, N170, late positivity [LP]), as well as affective ratings (valence and arousal) for the faces, were recorded. SAD patients rated the faces as generally more arousing, regardless of the type of emotion and intensity. Moreover, they displayed enhanced right-parietal LP (350-650 ms). Both arousal ratings and LP reflect stimulus intensity. Therefore, this study provides the first evidence of an intensity amplification bias in pediatric SAD during facial affect processing.
Adaptation to Emotional Conflict: Evidence from a Novel Face Emotion Paradigm
Clayson, Peter E.; Larson, Michael J.
2013-01-01
The preponderance of research on trial-by-trial recruitment of affective control (e.g., conflict adaptation) relies on stimuli wherein lexical word information conflicts with facial affective stimulus properties (e.g., the face-Stroop paradigm where an emotional word is overlaid on a facial expression). Several studies, however, indicate different neural time course and properties for processing of affective lexical stimuli versus affective facial stimuli. The current investigation used a novel task to examine control processes implemented following conflicting emotional stimuli with conflict-inducing affective face stimuli in the absence of affective words. Forty-one individuals completed a task wherein the affective-valence of the eyes and mouth were either congruent (happy eyes, happy mouth) or incongruent (happy eyes, angry mouth) while high-density event-related potentials (ERPs) were recorded. There was a significant congruency effect and significant conflict adaptation effects for error rates. Although response times (RTs) showed a significant congruency effect, the effect of previous-trial congruency on current-trial RTs was only present for current congruent trials. Temporospatial principal components analysis showed a P3-like ERP source localized using FieldTrip software to the medial cingulate gyrus that was smaller on incongruent than congruent trials and was significantly influenced by the recruitment of control processes following previous-trial emotional conflict (i.e., there was significant conflict adaptation in the ERPs). Results show that a face-only paradigm may be sufficient to elicit emotional conflict and suggest a system for rapidly detecting conflicting emotional stimuli and subsequently adjusting control resources, similar to cognitive conflict detection processes, when using conflicting facial expressions without words. PMID:24073278
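The congruency and conflict-adaptation (Gratton) effects described in this entry reduce to simple contrasts over condition means: the congruency cost is computed separately by previous-trial congruency, and conflict adaptation is the difference between those costs. A minimal sketch in Python, using purely hypothetical reaction-time values (none of these numbers come from the study):

```python
# Condition mean RTs (ms), keyed by (previous-trial, current-trial) congruency.
# All values are hypothetical, for illustration only.
rt = {
    ("congruent", "congruent"): 520.0,      # cC
    ("congruent", "incongruent"): 585.0,    # cI
    ("incongruent", "congruent"): 535.0,    # iC
    ("incongruent", "incongruent"): 560.0,  # iI
}

# Congruency cost = incongruent minus congruent RT, split by previous trial.
cost_after_congruent = rt[("congruent", "incongruent")] - rt[("congruent", "congruent")]
cost_after_incongruent = rt[("incongruent", "incongruent")] - rt[("incongruent", "congruent")]

# Conflict adaptation: the congruency cost shrinks after an incongruent trial.
conflict_adaptation = cost_after_congruent - cost_after_incongruent
print(conflict_adaptation)  # 40.0 (ms) with these illustrative values
```

The same two contrasts apply to error rates, which is where this study found its clearest adaptation effect.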
Sex, Sexual Orientation, and Identification of Positive and Negative Facial Affect
ERIC Educational Resources Information Center
Rahman, Qazi; Wilson, Glenn D.; Abrahams, Sharon
2004-01-01
Sex and sexual orientation related differences in processing of happy and sad facial emotions were examined using an experimental facial emotion recognition paradigm with a large sample (N=240). Analysis of covariance (controlling for age and IQ) revealed that women (irrespective of sexual orientation) had faster reaction times than men for…
Deficits in facial affect recognition among antisocial populations: a meta-analysis.
Marsh, Abigail A; Blair, R J R
2008-01-01
Individuals with disorders marked by antisocial behavior frequently show deficits in recognizing displays of facial affect. Antisociality may be associated with specific deficits in identifying fearful expressions, which would implicate dysfunction in neural structures that subserve fearful expression processing. A meta-analysis of 20 studies was conducted to assess: (a) if antisocial populations show any consistent deficits in recognizing six emotional expressions; (b) beyond any generalized impairment, whether specific fear recognition deficits are apparent; and (c) if deficits in fear recognition are a function of task difficulty. Results show a robust link between antisocial behavior and specific deficits in recognizing fearful expressions. This impairment cannot be attributed solely to task difficulty. These results suggest dysfunction among antisocial individuals in specified neural substrates, namely the amygdala, involved in processing fearful facial affect.
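The pooling step behind a meta-analysis like this one can be sketched as an inverse-variance random-effects model (here DerSimonian-Laird, a standard choice; the abstract does not specify the model used). The effect sizes and variances below are hypothetical placeholders, not values from the 20 reviewed studies:

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Pool per-study effect sizes under a DerSimonian-Laird random-effects model."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)
    w = 1.0 / variances                              # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)          # fixed-effect pooled estimate
    q = np.sum(w * (effects - fixed) ** 2)           # Cochran's Q (heterogeneity)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)    # between-study variance
    w_star = 1.0 / (variances + tau2)                # random-effects weights
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se

# Hypothetical Cohen's d values for fear-recognition deficits, one per study.
effects = [0.40, 0.55, 0.30, 0.65, 0.50]
variances = [0.04, 0.06, 0.05, 0.08, 0.03]
pooled_d, pooled_se = dersimonian_laird(effects, variances)
```

With these illustrative inputs the heterogeneity statistic falls below its degrees of freedom, so the between-study variance is truncated to zero and the pooled estimate coincides with the fixed-effect result.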
Holistic Processing of Static and Moving Faces
ERIC Educational Resources Information Center
Zhao, Mintao; Bülthoff, Isabelle
2017-01-01
Humans' face ability develops and matures with extensive experience in perceiving, recognizing, and interacting with faces that move most of the time. However, how facial movements affect 1 core aspect of face ability--holistic face processing--remains unclear. Here we investigated the influence of rigid facial motion on holistic and part-based…
Deficits in Degraded Facial Affect Labeling in Schizophrenia and Borderline Personality Disorder.
van Dijke, Annemiek; van 't Wout, Mascha; Ford, Julian D; Aleman, André
2016-01-01
Although deficits in facial affect processing have been reported in schizophrenia as well as in borderline personality disorder (BPD), these disorders have not yet been directly compared on facial affect labeling. Using degraded stimuli portraying neutral, angry, fearful, and happy facial expressions, we hypothesized more errors in labeling negative facial expressions in patients with schizophrenia compared to healthy controls. Patients with BPD were expected to have difficulty in labeling neutral expressions and to display a bias towards a negative attribution when wrongly labeling neutral faces. Patients with schizophrenia (N = 57) and patients with BPD (N = 30) were compared to patients with somatoform disorder (SoD, a psychiatric control group; N = 25) and healthy control participants (N = 41) on facial affect labeling accuracy and type of misattributions. Patients with schizophrenia showed deficits in labeling angry and fearful expressions compared to the healthy control group and patients with BPD showed deficits in labeling neutral expressions compared to the healthy control group. Schizophrenia and BPD patients did not differ significantly from each other when labeling any of the facial expressions. Compared to SoD patients, schizophrenia patients showed deficits on fearful expressions, but BPD did not significantly differ from SoD patients on any of the facial expressions. With respect to the type of misattributions, BPD patients mistook neutral expressions more often for fearful expressions compared to schizophrenia patients and healthy controls, and less often for happy compared to schizophrenia patients. These findings suggest that although schizophrenia and BPD patients demonstrate different as well as similar facial affect labeling deficits, BPD may be associated with a tendency to detect negative affect in neutral expressions.
Perceptual and affective mechanisms in facial expression recognition: An integrative review.
Calvo, Manuel G; Nummenmaa, Lauri
2016-09-01
Facial expressions of emotion involve a physical component of morphological changes in a face and an affective component conveying information about the expresser's internal feelings. It remains unresolved how much recognition and discrimination of expressions rely on the perception of morphological patterns or the processing of affective content. This review of research on the role of visual and emotional factors in expression recognition reached three major conclusions. First, behavioral, neurophysiological, and computational measures indicate that basic expressions are reliably recognized and discriminated from one another, albeit the effect may be inflated by the use of prototypical expression stimuli and forced-choice responses. Second, affective content along the dimensions of valence and arousal is extracted early from facial expressions, although this coarse affective representation contributes minimally to categorical recognition of specific expressions. Third, the physical configuration and visual saliency of facial features contribute significantly to expression recognition, with "emotionless" computational models being able to reproduce some of the basic phenomena demonstrated in human observers. We conclude that facial expression recognition, as it has been investigated in conventional laboratory tasks, depends to a greater extent on perceptual than affective information and mechanisms.
Neural bases of different cognitive strategies for facial affect processing in schizophrenia.
Fakra, Eric; Salgado-Pineda, Pilar; Delaveau, Pauline; Hariri, Ahmad R; Blin, Olivier
2008-03-01
To examine the neural basis and dynamics of facial affect processing in schizophrenic patients as compared to healthy controls. Fourteen schizophrenic patients and fourteen matched controls performed a facial affect identification task during fMRI acquisition. The emotional task included an intuitive emotional condition (matching emotional faces) and a more cognitively demanding condition (labeling emotional faces). Individual analysis for each emotional condition, and second-level t-tests examining both within-, and between-group differences, were carried out using a random effects approach. Psychophysiological interactions (PPI) were tested for variations in functional connectivity between amygdala and other brain regions as a function of changes in experimental conditions (labeling versus matching). During the labeling condition, both groups engaged similar networks. During the matching condition, schizophrenics failed to activate regions of the limbic system implicated in the automatic processing of emotions. PPI revealed an inverse functional connectivity between prefrontal regions and the left amygdala in healthy volunteers but there was no such change in patients. Furthermore, during the matching condition, and compared to controls, patients showed decreased activation of regions involved in holistic face processing (fusiform gyrus) and increased activation of regions associated with feature analysis (inferior parietal cortex, left middle temporal lobe, right precuneus). Our findings suggest that schizophrenic patients invariably adopt a cognitive approach when identifying facial affect. The distributed neocortical network observed during the intuitive condition indicates that patients may resort to feature-based, rather than configuration-based, processing, which may constitute a compensatory strategy for limbic dysfunction.
Sedda, Anna; Petito, Sara; Guarino, Maria; Stracciari, Andrea
2017-07-14
Most studies to date show an impairment in recognizing facial displays of disgust in Parkinson disease. A general impairment in disgust processing in patients with Parkinson disease might adversely affect their social interactions, given the relevance of this emotion for human relations. However, despite the importance of faces, disgust is also expressed through other formats of visual stimuli, such as sentences and visual images. The aim of our study was to explore disgust processing in a sample of patients affected by Parkinson disease, by means of various tests tackling not only facial recognition but also other formats of visual stimuli through which disgust can be recognized. Our results confirm that patients are impaired in recognizing facial displays of disgust. Further analyses show that patients are also impaired and slower for other facial expressions, with the only exception of happiness. Notably, however, patients with Parkinson disease processed visual images and sentences as controls did. Our findings show a dissociation among different formats of visual stimuli of disgust, suggesting that Parkinson disease is not characterized by a general compromising of disgust processing, as often suggested. The involvement of the basal ganglia-frontal cortex system might spare some cognitive components of emotional processing, related to memory and culture, at least for disgust. Copyright © 2017 Elsevier B.V. All rights reserved.
Neural mechanism for judging the appropriateness of facial affect.
Kim, Ji-Woong; Kim, Jae-Jin; Jeong, Bum Seok; Ki, Seon Wan; Im, Dong-Mi; Lee, Soo Jung; Lee, Hong Shick
2005-12-01
Questions regarding the appropriateness of facial expressions in particular situations arise ubiquitously in everyday social interactions. To determine the appropriateness of facial affect, we should first represent our own or the other's emotional state as induced by the social situation. Then, based on these representations, we should infer the possible affective response of the other person. In this study, we identified the brain mechanism mediating special types of social evaluative judgments of facial affect in which the internal reference is related to theory of mind (ToM) processing. Many previous ToM studies have used non-emotional stimuli, but, because so much valuable social information is conveyed through nonverbal emotional channels, this investigation used emotionally salient visual materials to tap ToM. Fourteen right-handed healthy subjects volunteered for our study. We used functional magnetic resonance imaging to examine brain activation during the judgmental task for the appropriateness of facial affects as opposed to gender matching tasks. We identified activation of a brain network, which includes the medial frontal cortex, left temporal pole, left inferior frontal gyrus, and left thalamus, during the judgmental task for appropriateness of facial affect compared to the gender matching task. The results of this study suggest that the brain system involved in ToM plays a key role in judging the appropriateness of facial affect in an emotionally laden situation. In addition, our results support the idea that common neural substrates are involved in performing diverse kinds of ToM tasks irrespective of perceptual modalities and the emotional salience of test materials.
Processing of subliminal facial expressions of emotion: a behavioral and fMRI study.
Prochnow, D; Kossack, H; Brunheim, S; Müller, K; Wittsack, H-J; Markowitsch, H-J; Seitz, R J
2013-01-01
The recognition of emotional facial expressions is an important means to adjust behavior in social interactions. As facial expressions widely differ in their duration and degree of expressiveness, they often manifest with short and transient expressions below the level of awareness. In this combined behavioral and fMRI study, we aimed at examining whether emotional facial expressions that are not consciously accessible (subliminal) influence empathic judgments, and which brain activations are related to this. We hypothesized that subliminal facial expressions of emotions masked with neutral expressions of the same faces induce an empathic processing similar to consciously accessible (supraliminal) facial expressions. Our behavioral data in 23 healthy subjects showed that subliminal emotional facial expressions of 40 ms duration affect the judgments of the subsequent neutral facial expressions. In the fMRI study in 12 healthy subjects it was found that both supra- and subliminal emotional facial expressions shared a widespread network of brain areas including the fusiform gyrus, the temporo-parietal junction, and the inferior, dorsolateral, and medial frontal cortex. Compared with subliminal facial expressions, supraliminal facial expressions led to a greater activation of left occipital and fusiform face areas. We conclude that masked subliminal emotional information is suited to trigger processing in brain areas which have been implicated in empathy and, thereby, in social encounters.
Kim, Do-Won; Kim, Han-Sung; Lee, Seung-Hwan; Im, Chang-Hwan
2013-12-01
Schizophrenia is one of the most devastating of all mental illnesses, and has dimensional characteristics that include both positive and negative symptoms. One problem reported in schizophrenia patients is that they tend to show deficits in face emotion processing, on which negative symptoms are thought to have stronger influence. In this study, four event-related potential (ERP) components (P100, N170, N250, and P300) and their source activities were analyzed using EEG data acquired from 23 schizophrenia patients while they were presented with facial emotion picture stimuli. Correlations between positive and negative syndrome scale (PANSS) scores and source activations during facial emotion processing were calculated to identify the brain areas affected by symptom scores. Our analysis demonstrates that PANSS positive scores are negatively correlated with major areas of the left temporal lobule for early ERP components (P100, N170) and with the right middle frontal lobule for a later component (N250), which indicates that positive symptoms affect both early face processing and facial emotion processing. On the other hand, PANSS negative scores are negatively correlated with several clustered regions, including the left fusiform gyrus (at P100), most of which are not overlapped with regions showing correlations with PANSS positive scores. Our results suggest that positive and negative symptoms affect independent brain regions during facial emotion processing, which may help to explain the heterogeneous characteristics of schizophrenia. © 2013 Elsevier B.V. All rights reserved.
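The symptom-activation analysis described in this entry is, at its core, a correlation between per-patient symptom scores and source-activation values. A minimal sketch with simulated data (the negative dependence is built in for illustration; nothing here reproduces the study's measurements):

```python
import numpy as np

rng = np.random.default_rng(0)
n_patients = 23  # matches the sample size reported in the abstract

# Simulated PANSS negative scores and source activations; the activation is
# constructed with a negative dependence on symptom load plus noise, so a
# negative correlation is expected by design.
panss_negative = rng.uniform(10, 35, n_patients)
activation = -0.04 * panss_negative + rng.normal(0.0, 0.1, n_patients)

# Pearson correlation between symptom scores and source activation.
r = np.corrcoef(panss_negative, activation)[0, 1]
```

In the actual study such correlations would be computed per ERP component and per source region, with appropriate correction for multiple comparisons.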
Hindocha, Chandni; Freeman, Tom P; Schafer, Grainne; Gardener, Chelsea; Das, Ravi K; Morgan, Celia J A; Curran, H Valerie
2015-03-01
Acute administration of the primary psychoactive constituent of cannabis, Δ-9-tetrahydrocannabinol (THC), impairs human facial affect recognition, implicating the endocannabinoid system in emotional processing. Another main constituent of cannabis, cannabidiol (CBD), has seemingly opposite functional effects on the brain. This study aimed to determine the effects of THC and CBD, both alone and in combination, on emotional facial affect recognition. 48 volunteers, selected for high and low frequency of cannabis use and schizotypy, were administered THC (8 mg), CBD (16 mg), THC+CBD (8 mg + 16 mg), and placebo, by inhalation, in a 4-way, double-blind, placebo-controlled crossover design. They completed an emotional facial affect recognition task including fearful, angry, happy, sad, surprise and disgust faces varying in intensity from 20% to 100%. A visual analogue scale (VAS) of feeling 'stoned' was also completed. In comparison to placebo, CBD improved emotional facial affect recognition at 60% emotional intensity; THC was detrimental to the recognition of ambiguous faces of 40% intensity. The combination of THC+CBD produced no impairment. Relative to placebo, both THC alone and combined THC+CBD equally increased feelings of being 'stoned'. CBD did not influence feelings of 'stoned'. No effects of frequency of use or schizotypy were found. In conclusion, CBD improves recognition of emotional facial affect and attenuates the impairment induced by THC. This is the first human study examining the effects of different cannabinoids on emotional processing. It provides preliminary evidence that different pharmacological agents acting upon the endocannabinoid system can both improve and impair recognition of emotional faces. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
Drug effects on responses to emotional facial expressions: recent findings.
Miller, Melissa A; Bershad, Anya K; de Wit, Harriet
2015-09-01
Many psychoactive drugs increase social behavior and enhance social interactions, which may, in turn, increase their attractiveness to users. Although the psychological mechanisms by which drugs affect social behavior are not fully understood, there is some evidence that drugs alter the perception of emotions in others. Drugs can affect the ability to detect, attend to, and respond to emotional facial expressions, which in turn may influence their use in social settings. Either increased reactivity to positive expressions or decreased response to negative expressions may facilitate social interaction. This article reviews evidence that psychoactive drugs alter the processing of emotional facial expressions using subjective, behavioral, and physiological measures. The findings lay the groundwork for better understanding how drugs alter social processing and social behavior more generally.
Bedi, Gillinder; Shiffrin, Laura; Vadhan, Nehal P; Nunes, Edward V; Foltin, Richard W; Bisaga, Adam
2016-04-01
In addition to difficulties in daily social functioning, regular cocaine users have decrements in social processing (the cognitive and affective processes underlying social behavior) relative to non-users. Little is known, however, about the effects of clinically-relevant pharmacological agents, such as cocaine and potential treatment medications, on social processing in cocaine users. Such drug effects could potentially alleviate or compound baseline social processing decrements in cocaine abusers. Here, we assessed the individual and combined effects of smoked cocaine and a potential treatment medication, levodopa-carbidopa-entacapone (LCE), on facial emotion recognition in cocaine smokers. Healthy non-treatment-seeking cocaine smokers (N = 14; two female) completed this 11-day inpatient within-subjects study. Participants received LCE (titrated to 400 mg/100 mg/200 mg b.i.d.) for five days with the remaining time on placebo. The order of medication administration was counterbalanced. Facial emotion recognition was measured twice during target LCE dosing and twice on placebo: once without cocaine and once after repeated cocaine doses. LCE increased the response threshold for identification of facial fear, biasing responses away from fear identification. Cocaine had no effect on facial emotion recognition. Results highlight the possibility for candidate pharmacotherapies to have unintended impacts on social processing in cocaine users, potentially exacerbating already existing difficulties in this population. © The Author(s) 2016.
ERIC Educational Resources Information Center
Carvajal, Fernando; Fernandez-Alcaraz, Camino; Rueda, Maria; Sarrion, Louise
2012-01-01
The processing of facial expressions of emotions by 23 adults with Down syndrome and moderate intellectual disability was compared with that of adults with intellectual disability of other etiologies (24 matched in cognitive level and 26 with mild intellectual disability). Each participant performed 4 tasks of the Florida Affect Battery and an…
ERIC Educational Resources Information Center
Magnee, Maurice J. C. M.; de Gelder, Beatrice; van Engeland, Herman; Kemner, Chantal
2007-01-01
Background: Despite extensive research, it is still debated whether impairments in social skills of individuals with pervasive developmental disorder (PDD) are related to specific deficits in the early processing of emotional information. We aimed to test both automatic processing of facial affect as well as the integration of auditory and visual…
Romani, Maria; Vigliante, Miriam; Faedda, Noemi; Rossetti, Serena; Pezzuti, Lina; Guidetti, Vincenzo; Cardona, Francesco
2018-06-01
This review focuses on facial recognition abilities in children and adolescents with attention deficit hyperactivity disorder (ADHD). A systematic review, using PRISMA guidelines, was conducted to identify original articles published prior to May 2017 pertaining to memory, face recognition, affect recognition, facial expression recognition and recall of faces in children and adolescents with ADHD. The qualitative synthesis based on different studies shows a particular focus of the research on facial affect recognition without paying similar attention to the structural encoding of facial recognition. In this review, we further investigate facial recognition abilities in children and adolescents with ADHD, providing a synthesis of the results observed in the literature, examining the face recognition tasks used to assess face processing abilities in ADHD, and identifying aspects not yet explored. Copyright © 2018 Elsevier Ltd. All rights reserved.
Neural measures of the role of affective prosody in empathy for pain.
Meconi, Federica; Doro, Mattia; Lomoriello, Arianna Schiano; Mastrella, Giulia; Sessa, Paola
2018-01-10
Emotional communication often needs the integration of affective prosodic and semantic components from speech and the speaker's facial expression. Affective prosody may have a special role by virtue of its dual nature: pre-verbal on one side and accompanying semantic content on the other. This consideration led us to hypothesize that it could act transversely, encompassing a wide temporal window involving the processing of facial expressions and semantic content expressed by the speaker. This would allow powerful communication in contexts of potential urgency such as witnessing the speaker's physical pain. Seventeen participants were shown faces preceded by verbal reports of pain. Facial expressions, intelligibility of the semantic content of the report (i.e., participants' mother tongue vs. fictional language) and the affective prosody of the report (neutral vs. painful) were manipulated. We monitored event-related potentials (ERPs) time-locked to the onset of the faces as a function of semantic content intelligibility and affective prosody of the verbal reports. We found that affective prosody may interact with facial expressions and semantic content in two successive temporal windows, supporting its role as a transverse communication cue.
Monkeys preferentially process body information while viewing affective displays.
Bliss-Moreau, Eliza; Moadab, Gilda; Machado, Christopher J
2017-08-01
Despite evolutionary claims about the function of facial behaviors across phylogeny, rarely are those hypotheses tested in a comparative context, that is, by evaluating how nonhuman animals process such behaviors. Further, while increasing evidence indicates that humans make meaning of faces by integrating contextual information, including that from the body, the extent to which nonhuman animals process contextual information during affective displays is unknown. In the present study, we evaluated the extent to which rhesus macaques (Macaca mulatta) process dynamic affective displays of conspecifics that included both facial and body behaviors. Contrary to hypotheses that they would preferentially attend to faces during affective displays, monkeys looked longest, most frequently, and first at conspecifics' bodies rather than their heads. These findings indicate that macaques, like humans, attend to available contextual information during the processing of affective displays, and that the body may also provide unique information about affective states. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Saneiro, Mar; Santos, Olga C; Salmeron-Majadas, Sergio; Boticario, Jesus G
2014-01-01
We report current findings when considering video recordings of facial expressions and body movements to provide affective personalized support in an educational context from an enriched multimodal emotion detection approach. In particular, we describe an annotation methodology to tag facial expression and body movements that conform to changes in the affective states of learners while dealing with cognitive tasks in a learning process. The ultimate goal is to combine these annotations with additional affective information collected during experimental learning sessions from different sources such as qualitative, self-reported, physiological, and behavioral information. These data altogether are to train data mining algorithms that serve to automatically identify changes in the learners' affective states when dealing with cognitive tasks which help to provide emotional personalized support. PMID:24892055
Corrugator activity confirms immediate negative affect in surprise
Topolinski, Sascha; Strack, Fritz
2015-01-01
The emotion of surprise entails a complex of immediate responses, such as cognitive interruption, attention allocation to, and more systematic processing of the surprising stimulus. All these processes serve the ultimate function to increase processing depth and thus cognitively master the surprising stimulus. The present account introduces phasic negative affect as the underlying mechanism responsible for this switch in operating mode. Surprising stimuli are schema-discrepant and thus entail cognitive disfluency, which elicits immediate negative affect. This affect in turn works like a phasic cognitive tuning switching the current processing mode from more automatic and heuristic to more systematic and reflective processing. Directly testing the initial elicitation of negative affect by surprising events, the present experiment presented high and low surprising neutral trivia statements to N = 28 participants while assessing their spontaneous facial expressions via facial electromyography. High compared to low surprising trivia elicited higher corrugator activity, indicative of negative affect and mental effort, while leaving zygomaticus (positive affect) and frontalis (cultural surprise expression) activity unaffected. Future research shall investigate the mediating role of negative affect in eliciting surprise-related outcomes. PMID:25762956
Gaudelus, B; Virgile, J; Peyroux, E; Leleu, A; Baudouin, J-Y; Franck, N
2015-06-01
Impairment of social cognition, including facial affect recognition, is a well-established trait in schizophrenia, and specific cognitive remediation programs focusing on facial affect recognition have been developed by different teams worldwide. However, even though social cognitive impairments have been confirmed, previous studies have also shown heterogeneity of results between subjects. Therefore, individual abilities should be assessed before proposing such programs. Most research teams apply tasks based on the facial affect recognition sets of Ekman et al. or Gur et al.; however, these tasks are not easily applicable in routine clinical practice. Here, we present the Facial Emotions Recognition Test (TREF), which is designed to identify facial affect recognition impairments in clinical practice. The test is composed of 54 photos and evaluates recognition of six universal emotions (joy, anger, sadness, fear, disgust and contempt). Each of these emotions is represented by colored photos of four different models (two men and two women) at nine intensity levels from 20 to 100%. Each photo is presented for 10 seconds; no time limit for responding is applied. The present study compared scores on the TREF in a sample of healthy controls (64 subjects) and people with stabilized schizophrenia (45 subjects) according to DSM IV-TR criteria. We analysed global scores for all emotions, as well as subscores for each emotion, between these two groups, taking gender differences into account. Our results were consistent with previous findings. Applying the TREF, we confirmed an impairment in facial affect recognition in schizophrenia by showing significant differences between the two groups in global results (76.45% for healthy controls versus 61.28% for people with schizophrenia), as well as in subscores for each emotion except joy. 
Scores for women were significantly higher than for men in the population without psychiatric diagnosis. The study also allowed the identification of cut-off scores: scores falling more than 2 standard deviations below the healthy control average (i.e., below 61.57%) pointed to a facial affect recognition deficit. The TREF appears to be a useful tool for identifying facial affect recognition impairment in schizophrenia. Neuropsychologists who have tried this task report positive feedback. The TREF is easy to use (duration of about 15 minutes), easy to apply in subjects with attentional difficulties, and tests facial affect recognition at ecologically valid intensity levels. These results have to be confirmed in the future with larger sample sizes and in comparison with other tasks evaluating facial affect recognition processes. Copyright © 2014 L'Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.
Implicit and explicit processing of emotional facial expressions in Parkinson's disease.
Wagenbreth, Caroline; Wattenberg, Lena; Heinze, Hans-Jochen; Zaehle, Tino
2016-04-15
Besides motor problems, Parkinson's disease (PD) is associated with detrimental emotional and cognitive functioning. Deficient explicit emotional processing has been observed, whilst patients also show impaired Theory of Mind (ToM) abilities. However, it is unclear whether PD patients' ToM deficit reflects an inability to infer others' emotional states or whether it is due to explicit emotional processing deficits. We investigated implicit and explicit emotional processing in PD with an affective priming paradigm in which we used pictures of human eyes as emotional primes and a lexical decision task (LDT) with emotionally connoted words as target stimuli. Sixteen PD patients and sixteen matched healthy controls first performed an LDT combined with an emotional priming paradigm providing emotional information through the facial eye region, to assess implicit emotional processing. Second, participants explicitly evaluated the emotional status of the eyes and words used in the implicit task. Compared to controls, implicit emotional processing abilities were generally preserved in PD, with, however, considerable alterations in happiness and disgust processing. Furthermore, we observed a general impairment in patients' explicit evaluation of emotional stimuli, which was augmented for the rating of facial expressions. This is the first study reporting results for affective priming with facial eye expressions in PD patients. Our findings indicate largely preserved implicit emotional processing, with specifically altered processing of disgust and happiness. Explicit emotional processing was considerably impaired for semantic and especially for facial stimulus material. Poor ToM abilities in PD patients might be based on deficient explicit emotional processing, with a preserved ability to implicitly infer other people's feelings. Copyright © 2016 Elsevier B.V. All rights reserved.
Distinct facial processing in schizophrenia and schizoaffective disorders
Chen, Yue; Cataldo, Andrea; Norton, Daniel J; Ongur, Dost
2011-01-01
Although schizophrenia and schizoaffective disorders have both similar and differing clinical features, it is not well understood whether similar or differing pathophysiological processes mediate patients’ cognitive functions. Using psychophysical methods, this study compared the performances of schizophrenia (SZ) patients, patients with schizoaffective disorder (SA), and a healthy control group in two face-related cognitive tasks: emotion discrimination, which tested perception of facial affect, and identity discrimination, which tested perception of non-affective facial features. Compared to healthy controls, SZ patients, but not SA patients, exhibited deficient performance in both fear and happiness discrimination, as well as identity discrimination. SZ patients, but not SA patients, also showed impaired performance in a theory-of-mind task for which emotional expressions are identified based upon the eye regions of face images. This pattern of results suggests distinct processing of face information in schizophrenia and schizoaffective disorders. PMID:21868199
Do Valenced Odors and Trait Body Odor Disgust Affect Evaluation of Emotion in Dynamic Faces?
Syrjänen, Elmeri; Liuzza, Marco Tullio; Fischer, Håkan; Olofsson, Jonas K
2017-12-01
Disgust is a core emotion that evolved to detect and avoid the ingestion of poisonous food as well as contact with pathogens and other harmful agents. Previous research has shown that multisensory presentation of olfactory and visual information may strengthen the processing of disgust-relevant information. However, it is not known whether these findings extend to dynamic facial stimuli that change from neutral to emotionally expressive, or whether individual differences in trait body odor disgust may influence the processing of disgust-related information. In this preregistered study, we tested whether the classification of dynamic facial expressions as happy or disgusted, and the emotional evaluation of these facial expressions, would be affected by individual differences in body odor disgust sensitivity and by exposure to a sweat-like, negatively valenced odor (valeric acid), as compared with a soap-like, positively valenced odor (lilac essence) or a no-odor control. Using Bayesian hypothesis testing, we found evidence that odors do not affect recognition of emotion in dynamic faces, even when body odor disgust sensitivity was used as a moderator. However, an exploratory analysis suggested that an unpleasant odor context may cause faster RTs to faces, independent of their emotional expression. Our results further our understanding of the scope and limits of odor effects on the perception of facial affect, and suggest that further studies should focus on reproducibility, specifying the experimental circumstances under which odor effects on facial expressions may be present versus absent.
Drug effects on responses to emotional facial expressions: recent findings
Miller, Melissa A.; Bershad, Anya K.; de Wit, Harriet
2016-01-01
Many psychoactive drugs increase social behavior and enhance social interactions, which may, in turn, increase their attractiveness to users. Although the psychological mechanisms by which drugs affect social behavior are not fully understood, there is some evidence that drugs alter the perception of emotions in others. Drugs can affect the ability to detect, attend to, and respond to emotional facial expressions, which in turn may influence their use in social settings. Either increased reactivity to positive expressions or decreased response to negative expressions may facilitate social interaction. This article reviews evidence that psychoactive drugs alter the processing of emotional facial expressions using subjective, behavioral, and physiological measures. The findings lay the groundwork for better understanding how drugs alter social processing and social behavior more generally. PMID:26226144
Faces in Context: A Review and Systematization of Contextual Influences on Affective Face Processing
Wieser, Matthias J.; Brosch, Tobias
2012-01-01
Facial expressions are of eminent importance for social interaction as they convey information about other individuals’ emotions and social intentions. According to the predominant “basic emotion” approach, the perception of emotion in faces is based on the rapid, automatic categorization of prototypical, universal expressions. Consequently, the perception of facial expressions has typically been investigated using isolated, de-contextualized, static pictures of facial expressions that maximize the distinction between categories. However, in everyday life, an individual’s face is not perceived in isolation, but almost always appears within a situational context, which may arise from other people, the physical environment surrounding the face, as well as multichannel information from the sender. Furthermore, situational context may be provided by the perceiver, including already present social information gained from affective learning and implicit processing biases such as race bias. Thus, the perception of facial expressions is presumably always influenced by contextual variables. In this comprehensive review, we aim at (1) systematizing the contextual variables that may influence the perception of facial expressions and (2) summarizing experimental paradigms and findings that have been used to investigate these influences. The studies reviewed here demonstrate that perception and neural processing of facial expressions are substantially modified by contextual information, including verbal, visual, and auditory information presented together with the face as well as knowledge or processing biases already present in the observer. These findings further challenge the assumption of automatic, hardwired categorical emotion extraction mechanisms predicted by basic emotion theories. Taking into account a recent model on face processing, we discuss where and when these different contextual influences may take place, thus outlining potential avenues in future research. 
PMID:23130011
Bekele, E; Bian, D; Peterman, J; Park, S; Sarkar, N
2017-06-01
Schizophrenia is a life-long, debilitating psychotic disorder with poor outcome that affects about 1% of the population. Although pharmacotherapy can alleviate some of the acute psychotic symptoms, residual social impairments present a significant barrier to successful rehabilitation. With limited resources and access to social skills training opportunities, innovative technology has emerged as a potentially powerful tool for intervention. In this paper, we present a novel virtual reality (VR)-based system for understanding the facial emotion processing impairments that may lead to poor social outcome in schizophrenia, which we henceforth call a VR System for Affect Analysis in Facial Expressions (VR-SAAFE). This system integrates a VR-based task presentation platform that can minutely control the facial expressions of an avatar, with or without accompanying verbal interaction, with an eye-tracker to quantitatively measure a participant's real-time gaze and a set of physiological sensors to infer his or her affective states, allowing in-depth understanding of the emotion recognition mechanisms of patients with schizophrenia based on quantitative metrics. A usability study with 12 patients with schizophrenia and 12 healthy controls was conducted to examine processing of the emotional faces. Preliminary results indicated significant differences in the way patients with schizophrenia processed and responded to the emotional faces presented in the VR environment compared with healthy control participants. These preliminary results underscore the utility of a VR-based system that enables precise and quantitative assessment of social skill deficits in patients with schizophrenia.
Psychocentricity and participant profiles: implications for lexical processing among multilinguals
Libben, Gary; Curtiss, Kaitlin; Weber, Silke
2014-01-01
Lexical processing among bilinguals is often affected by complex patterns of individual experience. In this paper we discuss the psychocentric perspective on language representation and processing, which highlights the centrality of individual experience in psycholinguistic experimentation. We discuss applications to the investigation of lexical processing among multilinguals and explore the advantages of using high-density experiments with multilinguals. High-density experiments are designed to co-index measures of lexical perception and production, as well as participant profiles. We discuss the challenges associated with the characterization of participant profiles and present a new data visualization technique that we term Facial Profiles. This technique is based on Chernoff faces, developed over 40 years ago. The Facial Profile technique seeks to overcome some of the challenges associated with the use of Chernoff faces, while maintaining the core insight that recoding multivariate data as facial features can engage the human face recognition system and thus enhance our ability to detect and interpret patterns within multivariate datasets. We demonstrate that Facial Profiles can code participant characteristics in lexical processing studies by recoding variables such as reading ability, speaking ability, and listening ability into the iconically related relative sizes of the eye, mouth, and ear, respectively. The balance of ability in bilinguals can be captured by creating composite facial profiles, or Janus Facial Profiles. We demonstrate the use of Facial Profiles and Janus Facial Profiles in the characterization of participant effects in the study of lexical perception and production. PMID:25071614
[Negative symptoms, emotion and cognition in schizophrenia].
Fakra, E; Belzeaux, R; Azorin, J-M; Adida, M
2015-12-01
For a long time, the treatment of schizophrenia was essentially focused on managing positive symptoms. Yet, even if these symptoms are the most noticeable, negative symptoms are more enduring, resistant to pharmacological treatment, and associated with a worse prognosis. In the last two decades, attention has shifted toward cognitive deficits, as these deficits are most robustly associated with functional outcome. But it appears that the modest improvement in cognition obtained in schizophrenia through pharmacological treatment or, more purposefully, by cognitive enhancement therapy has led only to limited amelioration of functional outcome. Authors have claimed that pure cognitive processes, such as those evaluated and trained in many of these programs, may be too distant from real-life conditions, as the latter are largely based on social interactions. Consequently, the field of social cognition, at the interface of cognition and emotion, has emerged. In the first part of this article, we examine the links, in schizophrenia, between negative symptoms, cognition and emotions from a therapeutic standpoint. Nonetheless, the investigation of emotion in schizophrenia may also hold relevant premises for understanding the pathophysiology of this disorder. In the second part, we illustrate this research by relying on the heuristic value of an elementary marker of social cognition, facial affect recognition. Facial affect recognition has repeatedly been reported to be impaired in schizophrenia, and some authors have argued that this deficit could constitute an endophenotype of the illness. We examine how facial affect processing has been used to explore broader emotion dysfunction in schizophrenia through behavioural and imaging studies. In particular, fMRI paradigms using facial affect have shown particular patterns of amygdala engagement in schizophrenia, suggesting an intact potential to elicit the limbic system which may, however, not be advantageous. 
Finally, we analyse facial affect processing at a cognitive-perceptual level, and the aptitude of patients with schizophrenia to manipulate featural and configural information in faces. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
Balconi, Michela; Lucchiari, Claudio
2005-02-01
Is facial expression recognition marked by specific event-related potential (ERP) effects? Are conscious and unconscious elaborations of emotional facial stimuli qualitatively different processes? In Experiment 1, ERPs elicited by supraliminal stimuli were recorded while 21 participants viewed emotional facial expressions of four emotions and a neutral stimulus. Two ERP components (N2 and P3) were analyzed for their peak amplitude and latency. First, emotional face-specificity was observed for the negative deflection N2, whereas P3 was not affected by the content of the stimulus (emotional or neutral). A more posterior distribution of ERPs was found for N2. Moreover, a lateralization effect was revealed for negative (right lateralization) and positive (left lateralization) facial expressions. In Experiment 2 (20 participants), 1-ms subliminal stimulation was carried out. Unaware information processing was revealed to be quite similar to aware information processing for peak amplitude but not for latency. In fact, unconscious stimulation produced a more delayed peak variation than conscious stimulation.
Perceiving emotions in neutral faces: expression processing is biased by affective person knowledge.
Suess, Franziska; Rabovsky, Milena; Abdel Rahman, Rasha
2015-04-01
According to a widely held view, basic emotions such as happiness or anger are reflected in facial expressions that are invariant and uniquely defined by specific facial muscle movements. Accordingly, expression perception should not be vulnerable to influences outside the face. Here, we test this assumption by manipulating the emotional valence of biographical knowledge associated with individual persons. Faces of well-known and initially unfamiliar persons displaying neutral expressions were associated with socially relevant negative, positive or comparatively neutral biographical information. The expressions of faces associated with negative information were classified as more negative than faces associated with neutral information. Event-related brain potential modulations in the early posterior negativity, a component taken to reflect early sensory processing of affective stimuli such as emotional facial expressions, suggest that negative affective knowledge can bias the perception of faces with neutral expressions toward subjectively displaying negative emotions. © The Author (2014). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
Larra, Mauro F.; Merz, Martina U.; Schächinger, Hartmut
2017-01-01
Facial self-resemblance has been associated with positive emotional evaluations, but this effect may be biased by self-face familiarity. Here we report two experiments utilizing startle modulation to investigate how the processing of facial expressions of emotion is affected by subtle resemblance to the self as well as to familiar faces. Participants in Experiment I (N = 39) were presented with morphed faces showing happy, neutral, and fearful expressions which were manipulated to resemble either their own or unknown faces. At SOAs of either 300 ms or 3500–4500 ms after picture onset, startle responses were elicited by binaural bursts of white noise (50 ms, 105 dB) and recorded at the orbicularis oculi via EMG. Manual reaction time was measured in a simple emotion discrimination paradigm. Pictures preceding noise bursts at the short SOA inhibited startle (prepulse inhibition, PPI). Both affective modulation and PPI of startle in response to emotional faces were altered by physical similarity to the self. As indexed both by relative facilitation of startle and by faster manual responses, self-resemblance apparently induced deeper processing of facial affect, particularly for happy faces. Experiment II (N = 54) produced similar findings using morphs of famous faces, yet showed no impact of mere familiarity on PPI effects or response times. The results are discussed with respect to differential (presumably pre-attentive) effects of self-specific vs. familiar information in face processing. PMID:29216226
Lee, Junghee; Kern, Robert S.; Harvey, Philippe-Olivier; Horan, William P.; Kee, Kimmy S.; Ochsner, Kevin; Penn, David L.; Green, Michael F.
2013-01-01
Background: Impaired facial affect recognition is the most consistent social cognitive finding in schizophrenia. Although social situations provide powerful constraints on our perception, little is known about how situational context modulates facial affect recognition in schizophrenia. Methods: Study 1 was a single-site study with 34 schizophrenia patients and 22 healthy controls. Study 2 was a two-site study with 68 schizophrenia patients and 28 controls. Both studies administered a Situational Context Facial Affect Recognition Task with two conditions: a situational context condition and a no-context condition. In the situational context condition, a briefly shown face was preceded by a sentence describing either a fear- or surprise-inducing event. In the no-context condition, a face was presented without a sentence. In both conditions, subjects rated how fearful or surprised the face appeared on a 9-point Likert scale. Results: In the situational context condition of study 1, both patients and controls rated faces as more afraid when they were paired with fear-inducing sentences and as more surprised when they were paired with surprise-inducing sentences. The degree of modulation was comparable across groups. In the no-context condition, patients rated faces comparably to controls. The findings of study 2 replicated those of study 1. Conclusions: Despite previously reported abnormalities in other types of context paradigms, these studies found intact situational context processing in schizophrenia, suggesting that patients benefit from situational context when interpreting ambiguous facial expressions. This area of relative social cognitive strength in schizophrenia has implications for social cognitive training programs. PMID:22532704
Xiao, Ruiqi; Li, Xianchun; Li, Lin; Wang, Yanmei
2016-01-01
Most previous studies on facial expression recognition have focused on moderate emotions; to date, few studies have investigated the explicit and implicit processing of peak emotions. In the current study, we used images of the transiently peak intense expressions of athletes at the winning or losing point in competition as materials, and investigated the diagnosability of peak facial expressions at both implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and face-body compounds, and eye movements were recorded. The results revealed that isolated bodies and face-body congruent images were better recognized than isolated faces and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous and that body cues influenced facial emotion recognition. Furthermore, the eye movement records showed that participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, a subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate the unconscious perception of peak facial expressions. The results showed that a winning face prime facilitated reactions to a winning body target, whereas a losing face prime inhibited reactions to a winning body target, suggesting that peak facial expressions could be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, a revised subliminal affective priming task and a strict awareness test were used to examine the validity of the unconscious perception of peak facial expressions found in Experiment 2A. 
Results of Experiment 2B showed that reaction times to both winning and losing body targets were influenced by the invisible peak facial expression primes, which indicates unconscious perception of peak facial expressions.
Xiao, Ruiqi; Li, Xianchun; Li, Lin; Wang, Yanmei
2016-01-01
Most previous studies on facial expression recognition have focused on emotions of moderate intensity; to date, few studies have investigated the explicit and implicit processing of peak emotions. In the current study, we used images of the transient, peak-intensity expressions of athletes at the winning or losing point in competition as materials, and investigated the diagnosability of peak facial expressions at both the implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and face-body compounds, while their eye movements were recorded. The results revealed that the isolated body and face-body congruent images were better recognized than isolated face and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous and that body cues influenced facial emotion recognition. Furthermore, the eye movement records showed that participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, a subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate the unconscious perception of peak facial expressions. The results showed that a winning face prime facilitated reactions to a winning body target, whereas a losing face prime inhibited reactions to a winning body target, suggesting that peak facial expressions could be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, a revised subliminal affective priming task and a strict awareness test were used to examine the validity of the unconscious perception of peak facial expressions found in Experiment 2A. The results of Experiment 2B showed that reaction times to both winning and losing body targets were influenced by the invisible peak facial expression primes, confirming the unconscious perception of peak facial expressions. PMID:27630604
Emotion unfolded by motion: a role for parietal lobe in decoding dynamic facial expressions.
Sarkheil, Pegah; Goebel, Rainer; Schneider, Frank; Mathiak, Klaus
2013-12-01
Facial expressions convey important emotional and social information and are frequently applied in investigations of human affective processing. Dynamic faces may provide higher ecological validity to examine perceptual and cognitive processing of facial expressions. Higher order processing of emotional faces was addressed by varying the task and virtual face models systematically. Blood oxygenation level-dependent activation was assessed using functional magnetic resonance imaging in 20 healthy volunteers while viewing and evaluating either the emotion or gender intensity of dynamic face stimuli. A general linear model analysis revealed that high valence activated a network of motion-responsive areas, indicating that visual motion areas support perceptual coding of the motion-based intensity of facial expressions. Comparing the emotion discrimination task with the gender discrimination task revealed increased activation of the inferior parietal lobule, which highlights the involvement of parietal areas in processing high-level features of faces. Dynamic emotional stimuli may help to emphasize functions of the hypothesized 'extended' over the 'core' system for face processing.
Stewart, Suzanne L K; Schepman, Astrid; Haigh, Matthew; McHugh, Rhian; Stewart, Andrew J
2018-03-14
The recognition of emotional facial expressions is often subject to contextual influence, particularly when the face and the context convey similar emotions. We investigated whether spontaneous, incidental affective theory of mind inferences made while reading vignettes describing social situations would produce context effects on the identification of same-valenced emotions (Experiment 1) as well as differently-valenced emotions (Experiment 2) conveyed by subsequently presented faces. Crucially, we found an effect of context on reaction times in both experiments while, in line with previous work, we found evidence for a context effect on accuracy only in Experiment 1. This demonstrates that affective theory of mind inferences made at the pragmatic level of a text can automatically, contextually influence the perceptual processing of emotional facial expressions in a separate task even when those emotions are of a distinctive valence. Thus, our novel findings suggest that language acts as a contextual influence to the recognition of emotional facial expressions for both same and different valences.
Lee, W J; Won, K H; Won, C H; Chang, S E; Choi, J H; Moon, K C; Lee, M W
2014-05-01
Although more than 300 cases of eosinophilic pustular folliculitis (EPF) have been reported to date, differences in clinicohistopathological findings among affected sites have not yet been evaluated. To evaluate differences in the clinical and histopathological features of facial and extrafacial EPF. Forty-six patients diagnosed with EPF were classified into those with facial and extrafacial disease according to the affected site. Clinical and histopathological characteristics were retrospectively compared, using all data available in the patient medical records. There were no significant between-group differences in subject ages at presentation, but a male predominance was observed in the extrafacial group. In addition, immunosuppression-associated type EPF was more common in the extrafacial group. Eruptions of plaques with an annular appearance were more common in the facial group. Histologically, perifollicular infiltration of eosinophils occurred more frequently in the facial group, whereas perivascular patterns occurred more frequently in the extrafacial group. Follicular mucinosis and exocytosis of inflammatory cells in the hair follicles were strongly associated with facial EPF. The clinical and histopathological characteristics of patients with facial and extrafacial EPF differ, suggesting the involvement of different pathogenic processes in the development of EPF at different sites. © 2013 British Association of Dermatologists.
CACNA1C risk variant affects facial emotion recognition in healthy individuals.
Nieratschker, Vanessa; Brückmann, Christof; Plewnia, Christian
2015-11-27
Recognition and correct interpretation of facial emotion is essential for social interaction and communication. Previous studies have shown that impairments in this cognitive domain are common features of several psychiatric disorders. Recent association studies identified CACNA1C as one of the most promising genetic risk factors for psychiatric disorders, and previous evidence suggests that the most replicated risk variant in CACNA1C (rs1006737) affects emotion recognition and processing. However, studies investigating the influence of rs1006737 on this intermediate phenotype in healthy subjects at the behavioral level are largely missing to date. Here, we applied the "Reading the Mind in the Eyes" test, a facial emotion recognition paradigm, in a cohort of 92 healthy individuals to address this question. Whereas accuracy was not affected by genotype, CACNA1C rs1006737 risk-allele carriers (AA/AG) showed significantly slower mean response times compared to individuals homozygous for the G-allele, indicating that healthy risk-allele carriers require more information to correctly identify a facial emotion. Our study is the first to provide evidence for an impairing behavioral effect of the CACNA1C risk variant rs1006737 on facial emotion recognition in healthy individuals, and it adds to the growing number of studies pointing towards CACNA1C as affecting intermediate phenotypes of psychiatric disorders.
Nentjes, Lieke; Bernstein, David P; Meijer, Ewout; Arntz, Arnoud; Wiers, Reinout W
2016-12-01
This study investigated the physiological, self-reported, and facial correlates of emotion regulation in psychopathy. Specifically, we compared psychopathic offenders (n = 42), nonpsychopathic offenders (n = 42), and nonoffender controls (n = 26) in their ability to inhibit and express emotion while watching affective films (fear, happy, and sad). Results showed that all participants were capable of drastically diminishing facial emotions under inhibition instructions. Contrary to expectation, psychopaths were not superior in adopting such a "poker face." Further, the inhibition of emotion was associated with cardiovascular changes, an effect that was also not dependent on psychopathy (or its factors), suggesting emotion inhibition to be an effortful process in psychopaths as well. Interestingly, psychopathic offenders did not differ from nonpsychopaths in the capacity to show content-appropriate facial emotions during the expression condition. Taken together, these data challenge the view that psychopathy is associated with either superior emotional inhibitory capacities or a generalized impairment in showing facial affect.
Contextual interference processing during fast categorisations of facial expressions.
Frühholz, Sascha; Trautmann-Lengsfeld, Sina A; Herrmann, Manfred
2011-09-01
We examined interference effects of emotionally associated background colours during fast valence categorisations of negative, neutral and positive expressions. According to implicitly learned colour-emotion associations, facial expressions were presented with colours that either matched the valence of these expressions or not. Experiment 1 included infrequent non-matching trials, and Experiment 2 a balanced ratio of matching and non-matching trials. Besides general modulatory effects of contextual features on the processing of facial expressions, we found differential effects depending on the valence of the target facial expressions. Whereas performance accuracy was mainly affected for neutral expressions, performance speed was specifically modulated by emotional expressions, indicating some susceptibility of emotional expressions to contextual features. Experiment 3 used two further colour-emotion combinations, but revealed only marginal interference effects, most likely due to missing colour-emotion associations. The results are discussed with respect to the inherent processing demands of emotional and neutral expressions and their susceptibility to contextual interference.
Emotion Unchained: Facial Expression Modulates Gaze Cueing under Cognitive Load.
Pecchinenda, Anna; Petrucci, Manuel
2016-01-01
Direction of eye gaze cues spatial attention, and typically this cueing effect is not modulated by the expression of a face unless top-down processes are explicitly or implicitly involved. To investigate the role of cognitive control on gaze cueing by emotional faces, participants performed a gaze cueing task with happy, angry, or neutral faces under high (i.e., counting backward by 7) or low cognitive load (i.e., counting forward by 2). Results show that high cognitive load enhances gaze cueing effects for angry facial expressions. In addition, cognitive load reduces gaze cueing for neutral faces, whereas happy facial expressions and gaze affected object preferences regardless of load. This evidence clearly indicates a differential role of cognitive control in processing gaze direction and facial expression, suggesting that under typical conditions, when we shift attention based on social cues from another person, cognitive control processes are used to reduce interference from emotional information. PMID:27959925
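The gaze-cueing effect described in this abstract is conventionally scored as the mean reaction-time cost for targets appearing at the location the face did not look at, relative to the gazed-at location. A minimal sketch of that computation; the function name and the trial reaction times below are hypothetical illustrations, not the authors' data or analysis code:

```python
def cueing_effect(rt_invalid, rt_valid):
    """Gaze-cueing effect in ms: positive values mean attention followed the gaze."""
    return sum(rt_invalid) / len(rt_invalid) - sum(rt_valid) / len(rt_valid)

# Fabricated illustrative reaction times (ms) for one condition cell:
# 'valid' = target at the gazed-at location, 'invalid' = opposite location.
valid = [412, 430, 398, 405, 441]
invalid = [455, 462, 430, 447, 470]
effect = cueing_effect(invalid, valid)
```

In a design like the one above, this difference score would be computed separately per expression (happy, angry, neutral) and load condition, and the modulation of interest is whether the effect grows or shrinks across those cells.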
Soussignan, R; Jiang, T; Rigaud, D; Royet, J P; Schaal, B
2010-03-01
To investigate hedonic reactivity and the influence of unconscious emotional processes on the low sensitivity to positive reinforcement of food in anorexia nervosa (AN), AN and healthy women were exposed to palatable food pictures just after a subliminal exposure to facial expressions (happy, disgust, fear and neutral faces), either while fasting or after a standardized meal (hunger versus satiety). Both implicit [facial electromyographic (EMG) activity from zygomatic and corrugator muscles, skin conductance, heart rate, and videotaped facial behavior] and explicit (self-reported pleasure and desire) measures of affective processes were recorded. In contrast to healthy women, the AN patients did not display objective or subjective indices of pleasure to food pictures when they were in the hunger state. Pleasure to food cues (liking) was more affected than the desire to eat (wanting) in AN patients. Subliminal 'fear faces' increased corrugator muscle reactivity to food stimuli in fasting AN patients, as compared to controls. The results suggest that unconscious fear cues increase the negative appraisal of alimentary stimuli in AN patients and thus contribute to decreased energy intake.
Ho, Tiffany C.; Zhang, Shunan; Sacchet, Matthew D.; Weng, Helen; Connolly, Colm G.; Henje Blom, Eva; Han, Laura K. M.; Mobayed, Nisreen O.; Yang, Tony T.
2016-01-01
While the extant literature has focused on major depressive disorder (MDD) as being characterized by abnormalities in processing affective stimuli (e.g., facial expressions), little is known regarding which specific aspects of cognition influence the evaluation of affective stimuli, and what are the underlying neural correlates. To investigate these issues, we assessed 26 adolescents diagnosed with MDD and 37 well-matched healthy controls (HCL) who completed an emotion identification task of dynamically morphing faces during functional magnetic resonance imaging (fMRI). We analyzed the behavioral data using a sequential sampling model of response time (RT) commonly used to elucidate aspects of cognition in binary perceptual decision making tasks: the Linear Ballistic Accumulator (LBA) model. Using a hierarchical Bayesian estimation method, we obtained group-level and individual-level estimates of LBA parameters on the facial emotion identification task. While the MDD and HCL groups did not differ in mean RT, accuracy, or group-level estimates of perceptual processing efficiency (i.e., drift rate parameter of the LBA), the MDD group showed significantly reduced responses in left fusiform gyrus compared to the HCL group during the facial emotion identification task. Furthermore, within the MDD group, fMRI signal in the left fusiform gyrus during affective face processing was significantly associated with greater individual-level estimates of perceptual processing efficiency. Our results therefore suggest that affective processing biases in adolescents with MDD are characterized by greater perceptual processing efficiency of affective visual information in sensory brain regions responsible for the early processing of visual information. The theoretical, methodological, and clinical implications of our results are discussed. PMID:26869950
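The Linear Ballistic Accumulator model used in the study above has a simple generative form: each response option races linearly toward a threshold from a uniformly drawn start point, with a drift rate drawn from a normal distribution, and the first accumulator to reach threshold determines the choice and the response time. A minimal simulation sketch with arbitrary illustrative parameter values; this is not the authors' hierarchical Bayesian implementation:

```python
import random

def simulate_lba_trial(drift_means, b=1.0, A=0.5, s=0.25, t0=0.2, rng=random):
    """Simulate one Linear Ballistic Accumulator trial.

    Each accumulator starts at k ~ Uniform(0, A) and rises linearly toward
    threshold b with drift d ~ Normal(v, s); the first to reach b gives the
    choice, and RT = non-decision time t0 plus the winner's rise time.
    """
    times = []
    for v in drift_means:
        d = max(rng.gauss(v, s), 1e-6)  # floor near zero so rise time stays finite
        k = rng.uniform(0.0, A)         # random start point
        times.append((b - k) / d)
    winner = min(range(len(times)), key=times.__getitem__)
    return winner, t0 + times[winner]

# Illustrative two-choice setup: option 0 has the higher mean drift rate
# (a proxy for perceptual processing efficiency), so it should win more often.
rng = random.Random(42)
trials = [simulate_lba_trial([1.2, 0.6], rng=rng) for _ in range(5000)]
accuracy = sum(1 for choice, _ in trials if choice == 0) / len(trials)
mean_rt = sum(rt for _, rt in trials) / len(trials)
```

In fitting rather than simulation, parameters such as the drift rates are estimated from observed choices and RTs; the study's group- and individual-level drift-rate estimates are what the fusiform fMRI signal was related to.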
Key, Alexandra P.; Ibanez, Lisa V.; Henderson, Heather A.; Warren, Zachary; Messinger, Daniel S.; Stone, Wendy L.
2014-01-01
Few behavioral indices of risk for autism spectrum disorders (ASD) are present before 12 months, and potential biomarkers remain largely unexamined. This prospective study of infant siblings of children with ASD (n = 16) and low-risk comparison infants (n = 15) examined group differences in event-related potentials (ERPs) indexing processing of facial positive affect (N290/P400, Nc) at 9 months and their relation to joint attention at 15 months. Group differences were most pronounced for subtle facial expressions, in that the low-risk group exhibited relatively longer processing (P400 latency) and greater attention resource allocation (Nc amplitude). Exploratory analyses found associations between ERP responses and later joint attention, suggesting that attention to positive affect cues may support the development of other social competencies. PMID:25056131
Gentsch, Kornelia; Grandjean, Didier; Scherer, Klaus R
2014-04-01
Componential theories assume that emotion episodes consist of emergent and dynamic response changes to relevant events in different components, such as appraisal, physiology, motivation, expression, and subjective feeling. In particular, Scherer's Component Process Model hypothesizes that subjective feeling emerges when the synchronization (or coherence) of appraisal-driven changes between emotion components has reached a critical threshold. We examined the prerequisite of this synchronization hypothesis for appraisal-driven response changes in facial expression. The appraisal process was manipulated by using feedback stimuli, presented in a gambling task. Participants' responses to the feedback were investigated in concurrently recorded brain activity related to appraisal (event-related potentials, ERP) and facial muscle activity (electromyography, EMG). Using principal component analysis, the prediction of appraisal-driven response changes in facial EMG was examined. Results support this prediction: early cognitive processes (related to the feedback-related negativity) seem to primarily affect the upper face, whereas processes that modulate P300 amplitudes tend to predominantly drive cheek region responses. Copyright © 2013 Elsevier B.V. All rights reserved.
Lelli-Chiesa, G; Kempton, M J; Jogia, J; Tatarelli, R; Girardi, P; Powell, J; Collier, D A; Frangou, S
2011-04-01
The Met allele of the catechol-O-methyltransferase (COMT) valine-to-methionine (Val158Met) polymorphism is known to affect dopamine-dependent affective regulation within amygdala-prefrontal cortical (PFC) networks. It is also thought to increase the risk of a number of disorders characterized by affective morbidity including bipolar disorder (BD), major depressive disorder (MDD) and anxiety disorders. The disease risk conferred is small, suggesting that this polymorphism represents a modifier locus. Therefore our aim was to investigate how the COMT Val158Met may contribute to phenotypic variation in clinical diagnosis using sad facial affect processing as a probe for its neural action. We employed functional magnetic resonance imaging to measure activation in the amygdala, ventromedial PFC (vmPFC) and ventrolateral PFC (vlPFC) during sad facial affect processing in family members with BD (n=40), MDD and anxiety disorders (n=22) or no psychiatric diagnosis (n=25) and 50 healthy controls. Irrespective of clinical phenotype, the Val158 allele was associated with greater amygdala activation and the Met158 allele with greater signal change in the vmPFC and vlPFC. Signal changes in the amygdala and vmPFC were not associated with disease expression. However, in the right vlPFC the Met158 allele was associated with greater activation in all family members with affective morbidity compared with relatives without a psychiatric diagnosis and healthy controls. Our results suggest that the COMT Val158Met polymorphism has a pleiotropic effect within the neural networks subserving emotional processing. Furthermore the Met158 allele further reduces cortical efficiency in the vlPFC in individuals with affective morbidity.
Naranjo, C; Kornreich, C; Campanella, S; Noël, X; Vandriette, Y; Gillain, B; de Longueville, X; Delatte, B; Verbanck, P; Constant, E
2011-02-01
The processing of emotional stimuli is thought to be negatively biased in major depression. This study investigates this issue using musical, vocal and facial affective stimuli. 23 depressed in-patients and 23 matched healthy controls were recruited. Affective information processing was assessed through musical, vocal and facial emotion recognition tasks. Depression, anxiety level and attention capacity were controlled. The depressed participants demonstrated less accurate identification of emotions than the control group in all three sorts of emotion-recognition tasks. The depressed group also gave higher intensity ratings than the controls when scoring negative emotions, and they were more likely to attribute negative emotions to neutral voices and faces. Our in-patient group might differ from the more general population of depressed adults. They were all taking anti-depressant medication, which may have had an influence on their emotional information processing. Major depression is associated with a general negative bias in the processing of emotional stimuli. Emotional processing impairment in depression is not confined to interpersonal stimuli (faces and voices); it is also present in the ability to perceive emotions in music. © 2010 Elsevier B.V. All rights reserved.
Murata, Aiko; Saito, Hisamichi; Schug, Joanna; Ogawa, Kenji; Kameda, Tatsuya
2016-01-01
A number of studies have shown that individuals often spontaneously mimic the facial expressions of others, a tendency known as facial mimicry. This tendency has generally been considered a reflex-like “automatic” response, but several recent studies have shown that the degree of mimicry may be moderated by contextual information. However, the cognitive and motivational factors underlying the contextual moderation of facial mimicry require further empirical investigation. In this study, we present evidence that the degree to which participants spontaneously mimic a target’s facial expressions depends on whether participants are motivated to infer the target’s emotional state. In the first study we show that facial mimicry, assessed by facial electromyography, occurs more frequently when participants are specifically instructed to infer a target’s emotional state than when given no instruction. In the second study, we replicate this effect using the Facial Action Coding System to show that participants are more likely to mimic facial expressions of emotion when they are asked to infer the target’s emotional state, rather than make inferences about a physical trait unrelated to emotion. These results provide convergent evidence that the explicit goal of understanding a target’s emotional state affects the degree of facial mimicry shown by the perceiver, suggesting moderation of reflex-like motor activities by higher cognitive processes. PMID:27055206
Time course of implicit processing and explicit processing of emotional faces and emotional words.
Frühholz, Sascha; Jellinghaus, Anne; Herrmann, Manfred
2011-05-01
Facial expressions are important emotional stimuli during social interactions. Symbolic emotional cues, such as affective words, also convey information regarding emotions that is relevant for social communication. Various studies have demonstrated fast decoding of emotions from words, as was shown for faces, whereas others report a rather delayed decoding of information about emotions from words. Here, we introduced an implicit (color naming) and explicit task (emotion judgment) with facial expressions and words, both containing information about emotions, to directly compare the time course of emotion processing using event-related potentials (ERP). The data show that only negative faces affected task performance, resulting in increased error rates compared to neutral faces. Presentation of emotional faces resulted in a modulation of the N170, the EPN and the LPP components and these modulations were found during both the explicit and implicit tasks. Emotional words only affected the EPN during the explicit task, but a task-independent effect on the LPP was revealed. Finally, emotional faces modulated source activity in the extrastriate cortex underlying the generation of the N170, EPN and LPP components. Emotional words led to a modulation of source activity corresponding to the EPN and LPP, but they also affected the N170 source on the right hemisphere. These data show that facial expressions affect earlier stages of emotion processing compared to emotional words, but the emotional value of words may have been detected at early stages of emotional processing in the visual cortex, as was indicated by the extrastriate source activity. Copyright © 2011 Elsevier B.V. All rights reserved.
Frühholz, Sascha; Fehr, Thorsten; Herrmann, Manfred
2009-10-01
Contextual features during recognition of facial affect are assumed to modulate the temporal course of emotional face processing. Here, we simultaneously presented colored backgrounds during valence categorizations of facial expressions. Subjects incidentally learned to perceive negative, neutral and positive expressions within a specific colored context. Subsequently, subjects made fast valence judgments while presented with the same face-color-combinations as in the first run (congruent trials) or with different face-color-combinations (incongruent trials). Incongruent trials induced significantly increased response latencies and significantly decreased performance accuracy. Contextual incongruent information during processing of neutral expressions modulated the P1 and the early posterior negativity (EPN) both localized in occipito-temporal areas. Contextual congruent information during emotional face perception revealed an emotion-related modulation of the P1 for positive expressions and of the N170 and the EPN for negative expressions. Highest amplitude of the N170 was found for negative expressions in a negatively associated context and the N170 amplitude varied with the amount of overall negative information. Incongruent trials with negative expressions elicited a parietal negativity which was localized to superior parietal cortex and which most likely represents a posterior manifestation of the N450 as an indicator of conflict processing. A sustained activation of the late LPP over parietal cortex for all incongruent trials might reflect enhanced engagement with facial expression during task conditions of contextual interference. In conclusion, whereas early components seem to be sensitive to the emotional valence of facial expression in specific contexts, late components seem to subserve interference resolution during emotional face processing.
Differential priming effect for subliminal fear and disgust facial expressions.
Lee, Su Young; Kang, Jee In; Lee, Eun; Namkoong, Kee; An, Suk Kyoon
2011-02-01
Compared to neutral or happy stimuli, subliminal fear stimuli are known to be well processed through the automatic pathway. We tried to examine whether fear stimuli could be processed more strongly than other negative emotional stimuli using a modified subliminal affective priming paradigm. Twenty-six healthy subjects participated in two separated sessions. Fear, disgust and neutral facial expressions were adopted as primes, and 50% happy facial stimuli were adopted as a target to let only stronger negative primes reveal a priming effect. Participants were asked to appraise the affect of target faces in the affect appraisal session and to appraise the genuineness of target faces in the genuineness appraisal session. The genuineness instruction was developed to help participants be sensitive to potential threats. In the affect appraisal, participants judged 50% happy target faces significantly more 'unpleasant' when they were primed by fear faces than primed by 50% happy control faces. In the genuineness appraisal, participants judged targets significantly more 'not genuine' when they were primed by fear and disgust faces than primed by controls. These findings suggest that there may be differential priming effects between subliminal fear and disgust expressions, which could be modulated by a sensitive context of potential threat.
Wojciechowski, Jerzy; Stolarski, Maciej; Matthews, Gerald
2014-01-01
Processing facial emotion, especially mismatches between facial and verbal messages, is believed to be important in the detection of deception. For example, emotional leakage may accompany lying. Individuals with superior emotion perception abilities may then be more adept at detecting deception by identifying mismatches between facial and verbal messages. Two personal factors that may predict such abilities are female gender and high emotional intelligence (EI). However, evidence on the role of gender and EI in the detection of deception is mixed. A key issue is that the facial processing skills required to detect deception may not be the same as those required to identify facial emotion. To test this possibility, we developed a novel facial processing task, the Face Decoding Test (FDT), which requires detection of inconsistencies between facial and verbal cues to emotion. We hypothesized that gender and ability EI would be related to performance when cues were inconsistent. We also hypothesized that gender effects would be mediated by EI, because women tend to score as more emotionally intelligent on ability tests. Data were collected from 210 participants. Analyses of the FDT suggested that EI was correlated with superior face decoding in all conditions. We also confirmed the expected gender difference, the superiority of high-EI individuals, and the mediation hypothesis. Furthermore, EI was more strongly associated with facial decoding performance in women than in men, implying there may be gender differences in strategies for processing affective cues. It is concluded that integration of emotional and cognitive cues may be a core attribute of EI that contributes to the detection of deception. PMID:24658500
Carrier-Toutant, Frédérike; Guay, Samuel; Beaulieu, Christelle; Léveillé, Édith; Turcotte-Giroux, Alexandre; Papineau, Samaël D; Brisson, Benoit; D'Hondt, Fabien; De Beaumont, Louis
2018-05-06
Concussions affect the processing of emotional stimuli. This study aimed to investigate how sex interacts with concussion effects on early event-related brain potential (ERP) measures (P1, N1) of emotional facial expression (EFE) processing in asymptomatic, multi-concussed athletes during an EFE identification task. Forty control athletes (20 females and 20 males) and 43 multi-concussed athletes (22 females and 21 males), recruited more than 3 months after their last concussion, were tested. Participants completed the Beck Depression Inventory II, the Beck Anxiety Inventory, the Post-Concussion Symptom Scale, and an Emotional Facial Expression Identification Task. Pictures of male and female faces expressing neutral, angry, and happy emotions were randomly presented, and the emotion depicted had to be identified as fast as possible during EEG acquisition. Relative to controls, concussed athletes of both sexes exhibited a significant suppression of P1 amplitude recorded from the dominant right hemisphere while performing the emotional facial expression identification task. The present study also highlighted a sex-specific suppression of the N1 component amplitude after concussion, which affected male athletes. These findings suggest that repeated concussions alter the typical pattern of right-hemisphere response dominance to EFE in early stages of EFE processing and that the neurophysiological mechanisms underlying the processing of emotional stimuli are affected differently across the sexes. (JINS, 2018, 24, 1-11).
Yamada, Makiko; Decety, Jean
2009-05-01
Results from recent functional neuroimaging studies suggest that facial expressions of pain trigger empathic mimicry responses in the observer, in the sense of an activation in the pain matrix. However, pain itself also signals a potential threat in the environment and urges individuals to escape or avoid its source. This evolutionarily primitive aspect of pain processing, i.e., avoidance from the threat value of pain, seems to conflict with the emergence of empathic concern, i.e., a motivation to approach toward the other. The present study explored whether the affective values of targets influence the detection of pain at the unconscious level. We found that the detection of pain was facilitated by unconscious negative affective processing rather than by positive affective processing. This suggests that detection of pain is primarily influenced by its inherent threat value, and that empathy and empathic concern may not rely on a simple reflexive resonance as generally thought. The results of this study provide a deeper understanding of how fundamental the unconscious detection of pain is to the processes involved in the experience of empathy and sympathy.
Virtual Characters: Visual Realism Affects Response Time and Decision-Making
ERIC Educational Resources Information Center
Sibuma, Bernadette
2012-01-01
This study integrates agent research with a neurocognitive technique to study how character faces affect cognitive processing. The N170 event-related potential (ERP) was used to study face processing during simple decision-making tasks. Twenty-five adults responded to facial expressions (fear/neutral) presented in three designs…
Tang, Qingting; Chen, Xu; Hu, Jia; Liu, Ying
2017-01-01
Our study explored how priming with a secure base schema affects the processing of emotional facial stimuli in individuals with attachment anxiety. We enrolled 42 undergraduate students between 18 and 27 years of age, and divided them into two groups: attachment anxiety and attachment secure. All participants were primed under two conditions: secure priming using references to the partner, and neutral priming using neutral references. We performed repeated attachment security priming combined with a dual-task paradigm and functional magnetic resonance imaging. Participants’ reaction times in responding to the facial stimuli were also measured. Attachment security priming can facilitate an individual’s processing of positive emotional faces; for instance, the presentation of the partner’s name was associated with stronger activity in a wide range of brain regions and faster reaction times for positive facial expressions. The current finding of higher activity in left-hemisphere regions for secure priming rather than neutral priming is consistent with the prediction that attachment security priming triggers the spread of activation of a positive emotional state. However, the difference in brain activity during processing of both positive and negative emotional facial stimuli between the two priming conditions appeared in the attachment anxiety group alone. This study indicates that the effect of attachment security priming on the processing of emotional facial stimuli could be mediated by chronic attachment anxiety. In addition, it highlights the association between higher-order processes of the attachment system (secure attachment schema priming) and the early-stage information processing system (attention), given the increased attention toward the effects of secure base schema on the processing of emotion- and attachment-related information among the insecure population. Thus, the present study may help guide clinical treatment of mood disorders in attachment anxiety. PMID:28473796
Effect of camera resolution and bandwidth on facial affect recognition.
Cruz, Mario; Cruz, Robyn Flaum; Krupinski, Elizabeth A; Lopez, Ana Maria; McNeeley, Richard M; Weinstein, Ronald S
2004-01-01
This preliminary study explored the effect of camera resolution and bandwidth on facial affect recognition, an important process and clinical variable in mental health service delivery. Sixty medical students and mental health-care professionals were recruited and randomized to four different combinations of commonly used teleconferencing camera resolutions and bandwidths: (1) a one-chip charge-coupled device (CCD) camera, commonly used for VHS-grade taping and in teleconferencing systems costing less than $4,000, with a resolution of 280 lines, at a bandwidth of 128 kilobits per second (kbps); (2) VHS at 768 kbps; (3) a three-chip CCD camera, commonly used for Betacam (Beta) grade taping and in teleconferencing systems costing more than $4,000, with a resolution of 480 lines, at 128 kbps; and (4) Betacam at 768 kbps. The subjects were asked to identify four facial affects dynamically presented on videotape by an actor and actress via a video monitor at 30 frames per second. Two-way analysis of variance (ANOVA) revealed a significant interaction effect for camera resolution and bandwidth (p = 0.02) and a significant main effect for camera resolution (p = 0.006), but no main effect for bandwidth. Post hoc testing of interaction means, using the Tukey Honestly Significant Difference (HSD) test with a critical difference (CD) of 1.71 at the 0.05 alpha level, revealed that subjects in the VHS/768 kbps (M = 7.133) and VHS/128 kbps (M = 6.533) conditions were significantly better at recognizing the displayed facial affects than those in the Betacam/768 kbps (M = 4.733) or Betacam/128 kbps (M = 6.333) conditions. Camera resolution and bandwidth combinations differ in their capacity to influence facial affect recognition. For service providers, this study's results support the use of VHS cameras at either 768 kbps or 128 kbps over Betacam cameras for facial affect recognition.
The authors argue that these results reflect the VHS resolution/bandwidth combinations' greater capacity to support signal detection (i.e., facial affect recognition) by subjects compared with the Betacam resolution/bandwidth combinations.
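The post hoc logic reported above can be sketched numerically: under Tukey's HSD, any pairwise difference between condition means that exceeds the reported critical difference (CD = 1.71) is significant at the .05 level. A minimal Python sketch using the four reported means (the condition labels are ours):

```python
from itertools import combinations

# Group means reported in the abstract (number of facial affects correctly
# identified); the critical difference (CD) at alpha = .05 was 1.71.
means = {
    "VHS/768kbps": 7.133,
    "VHS/128kbps": 6.533,
    "Betacam/768kbps": 4.733,
    "Betacam/128kbps": 6.333,
}
CD = 1.71

# Under Tukey's HSD, a pairwise mean difference larger than the CD is significant.
significant = [
    (a, b, round(abs(means[a] - means[b]), 3))
    for a, b in combinations(means, 2)
    if abs(means[a] - means[b]) > CD
]
print(significant)
# → both VHS conditions beat Betacam/768kbps, matching the abstract
```

Note that the Betacam/768 vs. Betacam/128 difference (1.6) falls just short of the CD, which is why only the two VHS comparisons are reported as significant.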
ERIC Educational Resources Information Center
Brenna, Viola; Proietti, Valentina; Montirosso, Rosario; Turati, Chiara
2013-01-01
The current study examined whether and how the presence of a positive or a negative emotional expression may affect the face recognition process at 3 months of age. Using a familiarization procedure, Experiment 1 demonstrated that positive (i.e., happiness), but not negative (i.e., fear and anger) facial expressions facilitate infants' ability to…
Residual fMRI sensitivity for identity changes in acquired prosopagnosia.
Fox, Christopher J; Iaria, Giuseppe; Duchaine, Bradley C; Barton, Jason J S
2013-01-01
While a network of cortical regions contributes to face processing, the lesions in acquired prosopagnosia are highly variable and likely result in different combinations of spared and affected regions of this network. To assess the residual functional sensitivities of spared regions in prosopagnosia, we designed a rapid event-related functional magnetic resonance imaging (fMRI) experiment that included pairs of faces with same or different identities and same or different expressions. By measuring the release from adaptation to these facial changes, we determined the residual sensitivity of face-selective regions of interest. We tested three patients with acquired prosopagnosia, and all three demonstrated residual sensitivity for facial identity changes in surviving fusiform and occipital face areas of either the right or left hemisphere, but not in the right posterior superior temporal sulcus. The patients also showed some residual capabilities for facial discrimination, with normal performance on the Benton Facial Recognition Test but impaired performance on more complex tasks of facial discrimination. We conclude that fMRI can demonstrate residual processing of facial identity in acquired prosopagnosia, that this adaptation can occur in the same structures that show similar processing in healthy subjects, and, further, that this adaptation may be related to behavioral indices of face perception.
Sassenrath, Claudia; Sassenberg, Kai; Ray, Devin G; Scheiter, Katharina; Jarodzka, Halszka
2014-01-01
Two studies examined an unexplored motivational determinant of facial emotion recognition: observer regulatory focus. It was predicted that a promotion focus would enhance facial emotion recognition relative to a prevention focus, because the attentional strategies associated with a promotion focus enhance performance on well-learned or innate tasks, such as facial emotion recognition. In Study 1, a promotion or a prevention focus was experimentally induced, and better facial emotion recognition was observed in a promotion focus compared to a prevention focus. In Study 2, individual differences in chronic regulatory focus were assessed and attention allocation was measured using eye tracking during the facial emotion recognition task. Results indicated that the positive relation between a promotion focus and facial emotion recognition is mediated by shorter fixation duration on the face, which reflects a pattern of attention allocation matched to the eager strategy of a promotion focus (i.e., striving to make hits). A prevention focus had an impact neither on perceptual processing nor on facial emotion recognition. Taken together, these findings demonstrate important mechanisms and consequences of observer motivational orientation for facial emotion recognition.
Stereotypes and prejudice affect the recognition of emotional body postures.
Bijlstra, Gijsbert; Holland, Rob W; Dotsch, Ron; Wigboldus, Daniel H J
2018-03-26
Most research on emotion recognition focuses on facial expressions. However, people communicate emotional information through bodily cues as well. Prior research on facial expressions has demonstrated that emotion recognition is modulated by top-down processes. Here, we tested whether this top-down modulation generalizes to the recognition of emotions from body postures. We report three studies demonstrating that stereotypes and prejudice about men and women may affect how fast people classify various emotional body postures. Our results suggest that gender cues activate gender associations, which affect the recognition of emotions from body postures in a top-down fashion. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Hepatitis Diagnosis Using Facial Color Image
NASA Astrophysics Data System (ADS)
Liu, Mingjia; Guo, Zhenhua
Facial color diagnosis is an important diagnostic method in traditional Chinese medicine (TCM). However, due to its qualitative, subjective and experience-based nature, traditional facial color diagnosis has a very limited application in clinical medicine. To circumvent the subjective and qualitative problems of facial color diagnosis in TCM, in this paper we present a novel computer-aided facial color diagnosis method (CAFCDM). The method has three parts: a Face Image Database, an Image Preprocessing Module and a Diagnosis Engine. The Face Image Database was built from a group of 116 patients affected by two kinds of liver disease and 29 healthy volunteers. Quantitative color features are extracted from the facial images by using popular digital image processing techniques. Then, a KNN classifier is employed to model the relationship between the quantitative color features and diseases. The results show that the method can properly identify three groups: healthy, severe hepatitis with jaundice and severe hepatitis without jaundice, with accuracy higher than 73%.
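The diagnosis engine described above is, at its core, a k-nearest-neighbour vote over quantitative colour features. Below is a minimal, self-contained sketch of that idea; the (r, g, b) feature vectors, class labels, and k value are invented for illustration and are not the study's actual data or parameters:

```python
import math
from collections import Counter

# Hypothetical training samples: mean (r, g, b) of a facial region per subject.
# A yellowish cast stands in for jaundice; a darker, dull cast for non-jaundiced
# hepatitis. These values are illustrative only.
train = [
    ((0.78, 0.62, 0.55), "healthy"),
    ((0.80, 0.64, 0.57), "healthy"),
    ((0.72, 0.66, 0.38), "hepatitis_jaundice"),
    ((0.70, 0.64, 0.36), "hepatitis_jaundice"),
    ((0.60, 0.50, 0.48), "hepatitis_no_jaundice"),
    ((0.58, 0.49, 0.47), "hepatitis_no_jaundice"),
]

def knn_predict(x, train, k=3):
    """Majority vote over the k training samples nearest to x (Euclidean)."""
    nearest = sorted(train, key=lambda s: math.dist(x, s[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

print(knn_predict((0.71, 0.65, 0.37), train))  # → hepatitis_jaundice
```

In practice the paper's pipeline would first normalize illumination in the preprocessing module before extracting features; this sketch assumes already-normalized colour values.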
Quarto, Tiziana; Blasi, Giuseppe; Pallesen, Karen Johanne; Bertolino, Alessandro; Brattico, Elvira
2014-01-01
The ability to recognize emotions contained in facial expressions is affected by both affective traits and states and varies widely between individuals. While affective traits are stable in time, affective states can be regulated more rapidly by environmental stimuli, such as music, that indirectly modulate the brain state. Here, we tested whether a relaxing or irritating sound environment affects implicit processing of facial expressions. Moreover, we investigated whether and how individual traits of anxiety and emotional control interact with this process. Thirty-two healthy subjects performed an implicit emotion processing task (presented to subjects as a gender discrimination task) while the sound environment was defined either by (a) a therapeutic music sequence (MusiCure), (b) a noise sequence or (c) silence. Individual changes in mood were sampled before and after the task by a computerized questionnaire. Additionally, emotional control and trait anxiety were assessed in a separate session by paper-and-pencil questionnaires. Results showed a better mood after the MusiCure condition compared with the other experimental conditions, and faster responses to happy faces during MusiCure compared with angry faces during Noise. Moreover, individuals with higher trait anxiety were faster in performing the implicit emotion processing task during MusiCure compared with Silence. These findings suggest that sound-induced affective states are associated with differential responses to angry and happy emotional faces at an implicit stage of processing, and that a relaxing sound environment facilitates implicit emotional processing in anxious individuals. PMID:25072162
Jusyte, Aiste; Schönenberg, Michael
2014-01-01
Socially anxious individuals have been shown to exhibit altered processing of facial affect, especially expressions signaling threat. Enhanced unaware processing has been suggested an important mechanism which may give rise to anxious conscious cognition and behavior. This study investigated whether individuals with social anxiety disorder (SAD) are perceptually more vulnerable to the biasing effects of subliminal threat cues compared to healthy controls. In a perceptual judgment task, 23 SAD and 23 matched control participants were asked to rate the affective valence of parametrically manipulated affective expressions ranging from neutral to angry. Each trial was preceded by subliminal presentation of an angry/neutral cue. The SAD group tended to rate target faces as “angry” when the preceding subliminal stimulus was angry vs. neutral, while healthy participants were not biased by the subliminal stimulus presentation. The perceptual bias in SAD was also associated with higher reaction time latencies in the subliminal angry cue condition. The results provide further support for enhanced unconscious threat processing in SAD individuals. The implications for etiology, maintenance, and treatment of SAD are discussed. PMID:25136307
Dissociation between recognition and detection advantage for facial expressions: a meta-analysis.
Nummenmaa, Lauri; Calvo, Manuel G
2015-04-01
Happy facial expressions are recognized faster and more accurately than other expressions in categorization tasks, whereas detection in visual search tasks is widely believed to be faster for angry than happy faces. We used meta-analytic techniques to resolve this categorization versus detection advantage discrepancy for positive versus negative facial expressions. Effect sizes were computed on the basis of the r statistic for a total of 34 recognition studies with 3,561 participants and 37 visual search studies with 2,455 participants, yielding a total of 41 effect sizes for recognition accuracy, 25 for recognition speed, and 125 for visual search speed. Random effects meta-analysis was conducted to estimate effect sizes at the population level. For recognition tasks, an advantage in recognition accuracy and speed for happy expressions was found for all stimulus types. In contrast, for visual search tasks, moderator analysis revealed that a happy face detection advantage was restricted to photographic faces, whereas a clear angry face advantage was found for schematic and "smiley" faces. A robust detection advantage for nonhappy faces was observed even when stimulus emotionality was distorted by inversion or rearrangement of the facial features, suggesting that visual features primarily drive the search. We conclude that the recognition advantage for happy faces is a genuine phenomenon related to processing of facial expression category and affective valence. In contrast, detection advantages toward either happy (photographic stimuli) or nonhappy (schematic) faces are contingent on visual stimulus features rather than facial expression, and may not involve categorical or affective processing. (c) 2015 APA, all rights reserved.
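The random-effects aggregation this meta-analysis relies on can be sketched as follows: each study's correlation r is Fisher-z transformed, between-study heterogeneity (tau-squared) is estimated, here with the common DerSimonian-Laird estimator, and the inverse-variance weighted mean is back-transformed to r. The (r, n) study pairs below are invented illustration values, not the paper's effect sizes:

```python
import math

# Hypothetical per-study (correlation, sample size) pairs for illustration.
studies = [(0.45, 80), (0.30, 120), (0.55, 60), (0.40, 100)]

z = [0.5 * math.log((1 + r) / (1 - r)) for r, _ in studies]  # Fisher z transform
v = [1.0 / (n - 3) for _, n in studies]                      # sampling variance of z
w = [1.0 / vi for vi in v]                                   # fixed-effect weights

# Fixed-effect pooled estimate and Cochran's Q heterogeneity statistic.
z_fixed = sum(wi * zi for wi, zi in zip(w, z)) / sum(w)
Q = sum(wi * (zi - z_fixed) ** 2 for wi, zi in zip(w, z))
df = len(studies) - 1
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - df) / c)                                # DerSimonian-Laird tau^2

# Random-effects weights add tau^2 to each study's variance.
w_re = [1.0 / (vi + tau2) for vi in v]
z_re = sum(wi * zi for wi, zi in zip(w_re, z)) / sum(w_re)
r_pooled = math.tanh(z_re)                                   # back-transform z -> r
print(round(r_pooled, 3))
```

When Q is below its degrees of freedom, tau-squared is truncated to zero and the random-effects estimate collapses to the fixed-effect one, which is why the `max(0.0, ...)` guard is needed.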
Franzen, Jessica; Brinkmann, Kerstin
2016-12-01
Theories and research on depression point to reduced responsiveness during reward anticipation and in part also during punishment anticipation. They also suggest weaker affective responses to reward consumption and unchanged affective responses to punishment consumption. However, studies investigating incentive anticipation using effort mobilization and incentive consumption using facial expressions are scarce. The present studies tested reward and punishment responsiveness in a subclinically depressed sample, manipulating a monetary reward (Study 1) and a monetary punishment (Study 2). Effort mobilization was operationalized as cardiovascular reactivity, while facial expressions were measured by facial electromyographic reactivity. Compared to nondysphorics, dysphorics showed reduced pre-ejection period (PEP) reactivity and blunted self-reported wanting during reward anticipation but reduced PEP reactivity and normal self-reported wanting during punishment anticipation. Compared to nondysphorics, dysphorics showed reduced zygomaticus major muscle reactivity and blunted self-reported liking during reward consumption but normal corrugator supercilii muscle reactivity and normal self-reported disliking during punishment consumption. Copyright © 2016. Published by Elsevier B.V.
Stability of Facial Affective Expressions in Schizophrenia
Fatouros-Bergman, H.; Spang, J.; Merten, J.; Preisler, G.; Werbart, A.
2012-01-01
Thirty-two videorecorded interviews were conducted by two interviewers with eight patients diagnosed with schizophrenia. Each patient was interviewed four times: three weekly interviews by the first interviewer and one additional interview by the second interviewer. 64 selected sequences where the patients were speaking about psychotic experiences were scored for facial affective behaviour with Emotion Facial Action Coding System (EMFACS). In accordance with previous research, the results show that patients diagnosed with schizophrenia express negative facial affectivity. Facial affective behaviour seems not to be dependent on temporality, since within-subjects ANOVA revealed no substantial changes in the amount of affects displayed across the weekly interview occasions. Whereas previous findings found contempt to be the most frequent affect in patients, in the present material disgust was as common, but depended on the interviewer. The results suggest that facial affectivity in these patients is primarily dominated by the negative emotions of disgust and, to a lesser extent, contempt and implies that this seems to be a fairly stable feature. PMID:22966449
Facial emotion processing in pediatric social anxiety disorder: Relevance of situational context.
Schwab, Daniela; Schienle, Anne
2017-08-01
Social anxiety disorder (SAD) typically begins in childhood. Previous research has demonstrated that adult patients respond with elevated late positivity (LP) to negative facial expressions. In the present study on pediatric SAD, we investigated responses to negative facial expressions and the role of social context information. Fifteen children with SAD and 15 non-anxious controls were first presented with images of negative facial expressions with masked backgrounds. Following this, the complete images, which included context information, were shown. The negative expressions were either the result of an emotion-relevant elicitor (e.g., social exclusion) or an emotion-irrelevant elicitor (e.g., weight lifting). Relative to controls, the clinical group showed elevated parietal LP during face processing both with and without context information. The groups also differed in frontal LP depending on the type of context: in SAD patients, frontal LP was lower in emotion-relevant than in emotion-irrelevant contexts. We conclude that SAD patients direct more automatic attention towards negative facial expressions (parietal effect) and are less capable of integrating affective context information (frontal effect). Copyright © 2017 Elsevier Ltd. All rights reserved.
Categorical Perception of Affective and Linguistic Facial Expressions
ERIC Educational Resources Information Center
McCullough, Stephen; Emmorey, Karen
2009-01-01
Two experiments investigated categorical perception (CP) effects for affective facial expressions and linguistic facial expressions from American Sign Language (ASL) for Deaf native signers and hearing non-signers. Facial expressions were presented in isolation (Experiment 1) or in an ASL verb context (Experiment 2). Participants performed ABX…
Foley, Elaine; Rippon, Gina; Thai, Ngoc Jade; Longe, Olivia; Senior, Carl
2012-02-01
Very little is known about the neural structures involved in the perception of realistic dynamic facial expressions. In the present study, a unique set of naturalistic dynamic facial emotional expressions was created. Through fMRI and connectivity analysis, a dynamic face perception network was identified, which is demonstrated to extend Haxby et al.'s [Haxby, J. V., Hoffman, E. A., & Gobbini, M. I. The distributed human neural system for face perception. Trends in Cognitive Science, 4, 223-233, 2000] distributed neural system for face perception. This network includes early visual regions, such as the inferior occipital gyrus, which is identified as insensitive to motion or affect but sensitive to the visual stimulus, the STS, identified as specifically sensitive to motion, and the amygdala, recruited to process affect. Measures of effective connectivity between these regions revealed that dynamic facial stimuli were associated with specific increases in connectivity between early visual regions, such as the inferior occipital gyrus and the STS, along with coupling between the STS and the amygdala, as well as the inferior frontal gyrus. These findings support the presence of a distributed network of cortical regions that mediate the perception of different dynamic facial expressions.
Contributions of feature shapes and surface cues to the recognition of facial expressions.
Sormaz, Mladen; Young, Andrew W; Andrews, Timothy J
2016-10-01
Theoretical accounts of face processing often emphasise feature shapes as the primary visual cue to the recognition of facial expressions. However, changes in facial expression also affect the surface properties of the face. In this study, we investigated whether this surface information can also be used in the recognition of facial expression. First, participants identified facial expressions (fear, anger, disgust, sadness, happiness) from images that were manipulated such that they varied mainly in shape or mainly in surface properties. We found that the categorization of facial expression is possible in either type of image, but that different expressions are relatively dependent on surface or shape properties. Next, we investigated the relative contributions of shape and surface information to the categorization of facial expressions. This employed a complementary method that involved combining the surface properties of one expression with the shape properties from a different expression. Our results showed that the categorization of facial expressions in these hybrid images was equally dependent on the surface and shape properties of the image. Together, these findings provide a direct demonstration that both feature shape and surface information make significant contributions to the recognition of facial expressions. Copyright © 2016 Elsevier Ltd. All rights reserved.
Kohn, Nils; Fernández, Guillén
2017-12-06
Our surroundings provide a host of sensory input, which we cannot fully process without streamlining and automatic processing. Levels of automaticity differ for different cognitive and affective processes. Situational and contextual interactions between cognitive and affective processes in turn influence the level of automaticity. Automaticity can be measured by interference in Stroop tasks. We applied an emotional version of the Stroop task to investigate how stress as a contextual factor influences the affective valence-dependent level of automaticity. 120 young, healthy men were investigated for behavioral and brain interference following a stress induction or control procedure in a counterbalanced cross-over design. Although Stroop interference was always observed, sex and emotion of the face strongly modulated interference, which was larger for fearful and male faces. These effects suggest higher automaticity when processing happy and also female faces. Supporting the behavioral patterns, brain data show lower interference-related brain activity in executive control-related regions in response to happy and female faces. In the absence of behavioral stress effects, congruent compared to incongruent trials (reverse interference) showed little to no deactivation under stress in response to happy female and fearful male trials. These congruency effects are potentially based on altered context- and stress-related facial processing that interacts with sex-emotion stereotypes. Results indicate that sex and facial emotion modulate Stroop interference in brain and behavior. These effects can be explained by altered response difficulty as a consequence of the contextual and stereotype-related modulation of automaticity. Copyright © 2017 Elsevier Ltd. All rights reserved.
Harmer, Catherine J; Shelley, Nicholas C; Cowen, Philip J; Goodwin, Guy M
2004-07-01
Antidepressants that inhibit the reuptake of serotonin (SSRIs) or norepinephrine (SNRIs) are effective in the treatment of disorders such as depression and anxiety. Cognitive psychological theories emphasize the importance of correcting negative biases of information processing in the nonpharmacological treatment of these disorders, but it is not known whether antidepressant drugs can directly modulate the neural processing of affective information. The present study therefore assessed the actions of repeated antidepressant administration on perception and memory for positive and negative emotional information in healthy volunteers. Forty-two male and female volunteers were randomly assigned to 7 days of double-blind intervention with the SSRI citalopram (20 mg/day), the SNRI reboxetine (8 mg/day), or placebo. On the final day, facial expression recognition, emotion-potentiated startle response, and memory for affect-laden words were assessed. Questionnaires monitoring mood, hostility, and anxiety were given before and after treatment. In the facial expression recognition task, citalopram and reboxetine reduced the identification of the negative facial expressions of anger and fear. Citalopram also abolished the increased startle response found in the context of negative affective images. Both antidepressants increased the relative recall of positive (versus negative) emotional material. These changes in emotional processing occurred in the absence of significant differences in ratings of mood and anxiety. However, reboxetine decreased subjective ratings of hostility and elevated energy. Short-term administration of two different antidepressant types had similar effects on emotion-related tasks in healthy volunteers, reducing the processing of negative relative to positive emotional material. Such effects of antidepressants may ameliorate the negative biases in information processing that characterize mood and anxiety disorders. 
They also suggest a mechanism of action potentially compatible with cognitive theories of anxiety and depression.
Retention interval affects visual short-term memory encoding.
Bankó, Eva M; Vidnyánszky, Zoltán
2010-03-01
Humans can efficiently store fine-detailed facial emotional information in visual short-term memory for several seconds. However, an unresolved question is whether the same neural mechanisms underlie high-fidelity short-term memory for emotional expressions at different retention intervals. Here we show that retention interval affects the neural processes of short-term memory encoding using a delayed facial emotion discrimination task. The early sensory P100 component of the event-related potentials (ERP) was larger in the 1-s interstimulus interval (ISI) condition than in the 6-s ISI condition, whereas the face-specific N170 component was larger in the longer ISI condition. Furthermore, the memory-related late P3b component of the ERP responses was also modulated by retention interval: it was reduced in the 1-s ISI as compared with the 6-s condition. The present findings cannot be explained based on differences in sensory processing demands or overall task difficulty because there was no difference in the stimulus information and subjects' performance between the two different ISI conditions. These results reveal that encoding processes underlying high-precision short-term memory for facial emotional expressions are modulated depending on whether information has to be stored for one or for several seconds.
Chen, Wenfeng; Liu, Chang Hong; Nakabayashi, Kazuyo
2012-01-01
Recent research has shown that the presence of a task-irrelevant attractive face can induce a transient diversion of attention from a perceptual task that requires covert deployment of attention to one of the two locations. However, it is not known whether this spontaneous appraisal of facial beauty also modulates attention in change detection among multiple locations, where a slower and more controlled search process is simultaneously affected by the magnitude of a change and by facial distinctiveness. Using the flicker paradigm, this study examines how spontaneous appraisal of facial beauty affects the detection of identity change among multiple faces. Participants viewed a display consisting of two alternating frames of four faces separated by a blank frame. In half of the trials, one of the faces (the target face) changed to a different person. The task of the participant was to indicate whether a change of face identity had occurred. The results showed that (1) observers were less efficient at detecting identity change among multiple attractive faces relative to unattractive faces when the target and distractor faces were not highly distinctive from one another; and (2) it was difficult to detect a change if the new face was similar to the old one. The findings suggest that attractive faces may interfere with the attention-switch process in change detection. The results also show that attention in change detection was strongly modulated by physical similarity between the alternating faces. Although facial beauty is a powerful stimulus that has well-demonstrated priority, its influence on change detection is easily superseded by low-level image similarity. The visual system appears to take a different approach to facial beauty when a task requires resource-demanding feature comparisons.
Habibi, Ruth; Khurana, Beena
2012-01-01
Facial recognition is key to social interaction; however, with unfamiliar faces, only generic information, in the form of facial stereotypes such as gender and age, is available. Is generic information therefore more prominent in unfamiliar versus familiar face processing? To address this question, we tapped into two relatively disparate stages of face processing. At the early stages of encoding, we employed perceptual masking to reveal that only the perception of unfamiliar face targets is affected by the gender of the facial masks. At the semantic end, using a priming paradigm, we found that while to-be-ignored unfamiliar faces prime lexical decisions to gender-congruent stereotypic words, familiar faces do not. Our findings indicate that gender is a more salient dimension in unfamiliar relative to familiar face processing, both in early perceptual stages as well as in later semantic stages of person construal. PMID:22389697
Holmes, Amanda; Winston, Joel S; Eimer, Martin
2005-10-01
To investigate the impact of spatial frequency on emotional facial expression analysis, ERPs were recorded in response to low spatial frequency (LSF), high spatial frequency (HSF), and unfiltered broad spatial frequency (BSF) faces with fearful or neutral expressions, houses, and chairs. In line with previous findings, BSF fearful facial expressions elicited a greater frontal positivity than BSF neutral facial expressions, starting at about 150 ms after stimulus onset. In contrast, this emotional expression effect was absent for HSF and LSF faces. Given that some brain regions involved in emotion processing, such as amygdala and connected structures, are selectively tuned to LSF visual inputs, these data suggest that ERP effects of emotional facial expression do not directly reflect activity in these regions. It is argued that higher order neocortical brain systems are involved in the generation of emotion-specific waveform modulations. The face-sensitive N170 component was neither affected by emotional facial expression nor by spatial frequency information.
Neural circuitry of emotional and cognitive conflict revealed through facial expressions.
Chiew, Kimberly S; Braver, Todd S
2011-03-09
Neural systems underlying conflict processing have been well studied in the cognitive realm, but the extent to which these overlap with those underlying emotional conflict processing remains unclear. A novel adaptation of the AX Continuous Performance Task (AX-CPT), a stimulus-response incompatibility paradigm, was examined that permits close comparison of emotional and cognitive conflict conditions, through the use of affectively-valenced facial expressions as the response modality. Brain activity was monitored with functional magnetic resonance imaging (fMRI) during performance of the emotional AX-CPT. Emotional conflict was manipulated on a trial-by-trial basis, by requiring contextually pre-cued facial expressions to emotional probe stimuli (IAPS images) that were either affectively compatible (low-conflict) or incompatible (high-conflict). The emotion condition was contrasted against a matched cognitive condition that was identical in all respects, except that probe stimuli were emotionally neutral. Components of the brain cognitive control network, including dorsal anterior cingulate cortex (ACC) and lateral prefrontal cortex (PFC), showed conflict-related activation increases in both conditions, but with higher activity during emotion conditions. In contrast, emotion conflict effects were not found in regions associated with affective processing, such as rostral ACC. These activation patterns provide evidence for a domain-general neural system that is active for both emotional and cognitive conflict processing. In line with previous behavioural evidence, greatest activity in these brain regions occurred when both emotional and cognitive influences additively combined to produce increased interference.
Facial Affect Recognition and Social Anxiety in Preschool Children
ERIC Educational Resources Information Center
Ale, Chelsea M.; Chorney, Daniel B.; Brice, Chad S.; Morris, Tracy L.
2010-01-01
Research relating anxiety and facial affect recognition has focused mostly on school-aged children and adults and has yielded mixed results. The current study sought to demonstrate an association among behavioural inhibition and parent-reported social anxiety, shyness, social withdrawal and facial affect recognition performance in 30 children,…
Qiao-Tasserit, Emilie; Garcia Quesada, Maria; Antico, Lia; Bavelier, Daphne; Vuilleumier, Patrik; Pichon, Swann
2017-01-01
Both affective states and personality traits shape how we perceive the social world and interpret emotions. The literature on affective priming has mostly focused on brief influences of emotional stimuli and emotional states on perceptual and cognitive processes. Yet this approach does not fully capture more dynamic processes at the root of emotional states, with such states lingering beyond the duration of the inducing external stimuli. Our goal was to put in perspective three different types of affective states (induced affective states, more sustained mood states and affective traits such as depression and anxiety) and investigate how they may interact and influence emotion perception. Here, we hypothesized that absorption into positive and negative emotional episodes generates sustained affective states that outlast the episode period and bias the interpretation of facial expressions in a perceptual decision-making task. We also investigated how such effects are influenced by more sustained mood states and by individual affect traits (depression and anxiety) and whether they interact. Transient emotional states were induced using movie clips, after which participants performed a forced-choice emotion classification task with morphed facial expressions ranging from fear to happiness. Using a psychometric approach, we show that negative (vs. neutral) clips increased participants' propensity to classify ambiguous faces as fearful during several minutes. In contrast, positive movies biased classification toward happiness only for those clips perceived as most absorbing. Negative mood, anxiety and depression had a stronger effect than transient states and increased the propensity to classify ambiguous faces as fearful. These results provide the first evidence that absorption and different temporal dimensions of emotions have a significant effect on how we perceive facial expressions.
[Impact of facial emotional recognition alterations in Dementia of the Alzheimer type].
Rubinstein, Wanda; Cossini, Florencia; Politis, Daniel
2016-07-01
Facial recognition of basic emotions is independent of other deficits in dementia of the Alzheimer type. Among these deficits, there is disagreement about which emotions are more difficult to recognize. Our aim was to study the presence of alterations in the process of facial recognition of basic emotions, and to investigate whether there were differences in the recognition of each type of emotion in Alzheimer's disease. With three tests of recognition of basic facial emotions, we evaluated 29 patients who had been diagnosed with dementia of the Alzheimer type and 18 control subjects. Significant differences between patients and controls were obtained on the tests of recognition of basic facial emotions, as well as between the different emotions. Since the amygdala, one of the brain structures responsible for emotional reaction, is affected in the early stages of this disease, our findings are relevant to understanding how this alteration of the process of emotional recognition contributes to the difficulties these patients have in both interpersonal relations and behavioral disorders.
Brielmann, Aenne A; Bülthoff, Isabelle; Armann, Regine
2014-07-01
Race categorization of faces is a fast and automatic process and is known to affect further face processing profoundly and at earliest stages. Whether processing of own- and other-race faces might rely on different facial cues, as indicated by diverging viewing behavior, is much under debate. We therefore aimed to investigate two open questions in our study: (1) Do observers consider information from distinct facial features informative for race categorization or do they prefer to gain global face information by fixating the geometrical center of the face? (2) Does the fixation pattern, or, if facial features are considered relevant, do these features differ between own- and other-race faces? We used eye tracking to test where European observers look when viewing Asian and Caucasian faces in a race categorization task. Importantly, in order to disentangle centrally located fixations from those towards individual facial features, we presented faces in frontal, half-profile and profile views. We found that observers showed no general bias towards looking at the geometrical center of faces, but rather directed their first fixations towards distinct facial features, regardless of face race. However, participants looked at the eyes more often in Caucasian faces than in Asian faces, and there were significantly more fixations to the nose for Asian compared to Caucasian faces. Thus, observers rely on information from distinct facial features rather than facial information gained by centrally fixating the face. To what extent specific features are looked at is determined by the face's race. Copyright © 2014 The Authors. Published by Elsevier Ltd.. All rights reserved.
Spisak, Brian R.; Blaker, Nancy M.; Lefevre, Carmen E.; Moore, Fhionna R.; Krebbers, Kleis F. B.
2014-01-01
Previous research indicates that followers tend to contingently match particular leader qualities to evolutionarily consistent situations requiring collective action (i.e., context-specific cognitive leadership prototypes) and information processing undergoes categorization which ranks certain qualities as first-order context-general and others as second-order context-specific. To further investigate this contingent categorization phenomenon we examined the “attractiveness halo”—a first-order facial cue which significantly biases leadership preferences. While controlling for facial attractiveness, we independently manipulated the underlying facial cues of health and intelligence and then primed participants with four distinct organizational dynamics requiring leadership (i.e., competition vs. cooperation between groups and exploratory change vs. stable exploitation). It was expected that the differing requirements of the four dynamics would contingently select for relatively healthier- or intelligent-looking leaders. We found perceived facial intelligence to be a second-order context-specific trait—for instance, in times requiring a leader to address between-group cooperation—whereas perceived health is significantly preferred across all contexts (i.e., a first-order trait). The results also indicate that facial health positively affects perceived masculinity while facial intelligence negatively affects perceived masculinity, which may partially explain leader choice in some of the environmental contexts. The limitations and a number of implications regarding leadership biases are discussed. PMID:25414653
Automatic emotion processing as a function of trait emotional awareness: an fMRI study
Lichev, Vladimir; Sacher, Julia; Ihme, Klas; Rosenberg, Nicole; Quirin, Markus; Lepsien, Jöran; Pampel, André; Rufer, Michael; Grabe, Hans-Jörgen; Kugel, Harald; Kersting, Anette; Villringer, Arno; Lane, Richard D.
2015-01-01
It is unclear whether reflective awareness of emotions is related to extent and intensity of implicit affective reactions. This study is the first to investigate automatic brain reactivity to emotional stimuli as a function of trait emotional awareness. To assess emotional awareness the Levels of Emotional Awareness Scale (LEAS) was administered. During scanning, masked happy, angry, fearful and neutral facial expressions were presented to 46 healthy subjects, who had to rate the fit between artificial and emotional words. The rating procedure allowed assessment of shifts in implicit affectivity due to emotion faces. Trait emotional awareness was associated with increased activation in the primary somatosensory cortex, inferior parietal lobule, anterior cingulate gyrus, middle frontal and cerebellar areas, thalamus, putamen and amygdala in response to masked happy faces. LEAS correlated positively with shifts in implicit affect caused by masked happy faces. According to our findings, people with high emotional awareness show stronger affective reactivity and more activation in brain areas involved in emotion processing and simulation during the perception of masked happy facial expression than people with low emotional awareness. High emotional awareness appears to be characterized by an enhanced positive affective resonance to others at an automatic processing level. PMID:25140051
Helt, Molly S; Fein, Deborah A
2016-01-01
Both social input and facial feedback appear to be processed differently by individuals with autism spectrum disorder (ASD). We tested the effects of both of these types of input on laughter in children with ASD. Sensitivity to facial feedback was tested in 43 children with ASD, aged 8-14 years, and 43 typically developing children matched for mental age (6-14), in order to examine whether children with ASD use bodily feedback as an implicit source of information. Specifically, children were asked to view cartoons as they normally would (control condition), and while holding a pencil in their mouth forcing their smiling muscles into activation (feedback condition) while rating their enjoyment of the cartoons. The authors also explored the effects of social input in children with ASD by investigating whether the presence of a caregiver or friend (companion condition), or the presence of a laugh track superimposed upon the cartoon (laugh track condition) increased the children's self-rated enjoyment of cartoons or the amount of positive affect they displayed. Results showed that the group with ASD was less affected by all three experimental conditions, but also that group differences seemed to have been driven by one specific symptom of ASD: restricted range of affect. The strong relationship between restricted affect and insensitivity to facial feedback found in this study sheds light on the implications of restricted affect for social development in ASD.
Impaired perception of facial emotion in developmental prosopagnosia.
Biotti, Federica; Cook, Richard
2016-08-01
Developmental prosopagnosia (DP) is a neurodevelopmental condition characterised by difficulties recognising faces. Despite severe difficulties recognising facial identity, expression recognition is typically thought to be intact in DP; case studies have described individuals who are able to correctly label photographic displays of facial emotion, and no group differences have been reported. This pattern of deficits suggests a locus of impairment relatively late in the face processing stream, after the divergence of expression and identity analysis pathways. To date, however, there has been little attempt to investigate emotion recognition systematically in a large sample of developmental prosopagnosics using sensitive tests. In the present study, we describe three complementary experiments that examine emotion recognition in a sample of 17 developmental prosopagnosics. In Experiment 1, we investigated observers' ability to make binary classifications of whole-face expression stimuli drawn from morph continua. In Experiment 2, observers judged facial emotion using only the eye-region (the rest of the face was occluded). Analyses of both experiments revealed diminished ability to classify facial expressions in our sample of developmental prosopagnosics, relative to typical observers. Imprecise expression categorisation was particularly evident in those individuals exhibiting apperceptive profiles, associated with problems encoding facial shape accurately. Having split the sample of prosopagnosics into apperceptive and non-apperceptive subgroups, only the apperceptive prosopagnosics were impaired relative to typical observers. In our third experiment, we examined observers' ability to classify the emotion present within segments of vocal affect. Despite difficulties judging facial emotion, the prosopagnosics exhibited excellent recognition of vocal affect. 
Contrary to the prevailing view, our results suggest that many prosopagnosics do experience difficulties classifying expressions, particularly those with apperceptive profiles. These individuals may have difficulties forming view-invariant structural descriptions at an early stage in the face processing stream, before identity and expression pathways diverge. Copyright © 2016 Elsevier Ltd. All rights reserved.
Relations between emotions, display rules, social motives, and facial behaviour.
Zaalberg, Ruud; Manstead, Antony; Fischer, Agneta
2004-02-01
We report research on the relations between emotions, display rules, social motives, and facial behaviour. In Study 1 we used a questionnaire methodology to examine how respondents would react to a funny or a not funny joke told to them by a close friend or a stranger. We assessed display rules and motivations for smiling and/or laughing. Display rules and social motives (partly) mediated the relationship between the experimental manipulations and self-reported facial behaviour. Study 2 was a laboratory experiment in which funny or not funny jokes were told to participants by a male or female stranger. Consistent with hypotheses, hearing a funny joke evoked a stronger motivation to share positive affect, expressed in longer Duchenne smiling. Contrary to hypotheses, a not funny joke did not elicit greater prosocial motivation expressed in longer "polite" smiling, although such a smiling pattern did occur. Rated funniness of the joke and the motivation to share positive affect mediated the relationship between the joke manipulation and facial behaviour. Path analysis was used to explore this mediating process in greater detail.
ERIC Educational Resources Information Center
Shenk, Chad E.; Putnam, Frank W.; Noll, Jennie G.
2013-01-01
Previous research demonstrates that both child maltreatment and intellectual performance contribute uniquely to the accurate identification of facial affect by children and adolescents. The purpose of this study was to extend this research by examining whether child maltreatment affects the accuracy of facial recognition differently at varying…
Diéguez-Risco, Teresa; Aguado, Luis; Albert, Jacobo; Hinojosa, José Antonio
2015-12-01
The influence of explicit evaluative processes on the contextual integration of facial expressions of emotion was studied in a procedure that required the participants to judge the congruency of happy and angry faces with preceding sentences describing emotion-inducing situations. Judgments were faster on congruent trials in the case of happy faces and on incongruent trials in the case of angry faces. At the electrophysiological level, a congruency effect was observed in the face-sensitive N170 component that showed larger amplitudes on incongruent trials. An interactive effect of congruency and emotion appeared on the LPP (late positive potential), with larger amplitudes in response to happy faces that followed anger-inducing situations. These results show that the deliberate intention to judge the contextual congruency of facial expressions influences not only processes involved in affective evaluation such as those indexed by the LPP but also earlier processing stages that are involved in face perception. Copyright © 2015. Published by Elsevier B.V.
Faces in-between: evaluations reflect the interplay of facial features and task-dependent fluency.
Winkielman, Piotr; Olszanowski, Michal; Gola, Mateusz
2015-04-01
Facial features influence social evaluations. For example, faces are rated as more attractive and trustworthy when they have more smiling features and also more female features. However, the influence of facial features on evaluations should be qualified by the affective consequences of fluency (cognitive ease) with which such features are processed. Further, fluency (along with its affective consequences) should depend on whether the current task highlights conflict between specific features. Four experiments are presented. In 3 experiments, participants saw faces varying in expressions ranging from pure anger, through mixed expression, to pure happiness. Perceivers first categorized faces either on a control dimension or an emotional dimension (angry/happy). Thus, the emotional categorization task made "pure" expressions fluent and "mixed" expressions disfluent. Next, participants made social evaluations. Results show that after emotional categorization, but not control categorization, targets with mixed expressions are relatively devalued. Further, this effect is mediated by categorization disfluency. Additional data from facial electromyography reveal that on a basic physiological level, affective devaluation of mixed expressions is driven by their objective ambiguity. The fourth experiment shows that the relative devaluation of mixed faces that vary in gender ambiguity requires a gender categorization task. Overall, these studies highlight that the impact of facial features on evaluation is qualified by their fluency, and that the fluency of features is a function of the current task. The discussion highlights the implications of these findings for research on emotional reactions to ambiguity. (c) 2015 APA, all rights reserved.
Hulvershorn, Leslie A; Finn, Peter; Hummer, Tom A; Leibenluft, Ellen; Ball, Brandon; Gichina, Victoria; Anand, Amit
2013-08-01
Recent longitudinal studies demonstrate that addiction risk may be influenced by a cognitive, affective and behavioral phenotype that emerges during childhood. Relatively little research has focused on the affective or emotional risk components of this high-risk phenotype, including the relevant neurobiology. Non-substance abusing youth (N=19; mean age=12.2) with externalizing psychopathology and paternal history of a substance use disorder and demographically matched healthy comparisons (N=18; mean age=11.9) were tested on a facial emotion matching task during functional MRI. This task involved matching faces by emotions (angry, anxious) or matching shape orientation. High-risk youth exhibited increased medial prefrontal, precuneus and occipital cortex activation compared to the healthy comparison group during the face matching condition, relative to the control shape condition. The occipital activation correlated positively with parent-rated emotion regulation impairments in the high-risk group. These findings suggest a preexisting abnormality in cortical activation in response to facial emotion matching in youth at high risk for the development of problem drug or alcohol use. These cortical deficits may underlie impaired affective processing and regulation, which in turn may contribute to escalating drug use in adolescence. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Unconscious processing of facial affect in children and adolescents.
Killgore, William D S; Yurgelun-Todd, Deborah A
2007-01-01
In a previous study, with adults, we demonstrated that the amygdala and anterior cingulate gyrus are differentially responsive to happy and sad faces presented subliminally. Because the ability to perceive subtle facial signals communicating sadness is an important aspect of prosocial development, and is critical for empathic behavior, we examined this phenomenon from a developmental perspective using a backward masking paradigm. While undergoing functional magnetic resonance imaging (fMRI), 10 healthy adolescent children were presented with a series of happy and sad facial expressions, each lasting 20 ms and masked immediately by a neutral face to prevent conscious awareness of the affective expression. Relative to fixation baseline, masked sad faces activated the right amygdala, whereas masked happy faces failed to activate any of the regions of interest. Direct comparison between masked happy and sad faces revealed valence specific differences in the anterior cingulate gyrus. When the data were compared statistically to our previous sample of adults, the adolescent group showed significantly greater activity in the right amygdala relative to the adults during the masked sad condition. Groups also differed in several non-hypothesized regions. Development of unconscious perception from adolescence into adulthood appears to be accompanied by reduced activity within limbic affect processing systems, and perhaps increased involvement of other cortical and cerebellar systems.
The Turner Syndrome: Cognitive Deficits, Affective Discrimination, and Behavior Problems.
ERIC Educational Resources Information Center
McCauley, Elizabeth; And Others
1987-01-01
The study attempted to link cognitive and social problems seen in girls with Turner syndrome by assessing the girls' ability to process affective cues. Seventeen 9- to 17-year-old girls diagnosed with Turner syndrome were compared to a matched control group on a task which required interpretation of affective intention from facial expression.…
Mavratzakis, Aimee; Herbert, Cornelia; Walla, Peter
2016-01-01
In the current study, electroencephalography (EEG) was recorded simultaneously with facial electromyography (fEMG) to determine whether emotional faces and emotional scenes are processed differently at the neural level. In addition, it was investigated whether these differences can be observed at the behavioural level via spontaneous facial muscle activity. Emotional content of the stimuli did not affect early P1 activity. Emotional faces elicited enhanced amplitudes of the face-sensitive N170 component, while its counterpart, the scene-related N100, was not sensitive to the emotional content of scenes. At 220-280 ms, the early posterior negativity (EPN) was enhanced only slightly for fearful as compared to neutral or happy faces. However, its amplitudes were significantly enhanced during processing of scenes with positive content, particularly over the right hemisphere. Scenes of positive content also elicited enhanced spontaneous zygomatic activity from 500-750 ms onwards, while happy faces elicited no such changes. In contrast, both fearful faces and negative scenes elicited enhanced spontaneous corrugator activity at 500-750 ms after stimulus onset. However, relative to baseline, EMG changes occurred earlier for faces (250 ms) than for scenes (500 ms), whereas for scenes the activity changes were more pronounced over the whole viewing period. Taking all effects into account, the data suggest that emotional facial expressions evoke faster attentional orienting, but weaker affective neural activity and emotional behavioural responses, compared to emotional scenes. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
Bodily action penetrates affective perception
Rigutti, Sara; Gerbino, Walter
2016-01-01
Fantoni & Gerbino (2014) showed that subtle postural shifts associated with reaching can have a strong hedonic impact and affect how actors experience facial expressions of emotion. Using a novel Motor Action Mood Induction Procedure (MAMIP), they found consistent congruency effects in participants who performed a facial emotion identification task after a sequence of visually-guided reaches: a face perceived as neutral in a baseline condition appeared slightly happy after comfortable actions and slightly angry after uncomfortable actions. However, skeptics about the penetrability of perception (Zeimbekis & Raftopoulos, 2015) would consider such evidence insufficient to demonstrate that observers' internal states induced by action comfort/discomfort affect perception in a top-down fashion. The action-modulated mood might have produced a back-end memory effect capable of affecting post-perceptual and decision processing, but not front-end perception. Here, we present evidence that performance in a facial emotion detection (not identification) task after MAMIP exhibits systematic mood-congruent sensitivity changes, rather than response bias changes attributable to cognitive set shifts; that is, we show that observers' internal states induced by bodily action can modulate affective perception. The detection threshold for happiness was lower after fifty comfortable than after fifty uncomfortable reaches, while the detection threshold for anger was lower after fifty uncomfortable than after fifty comfortable reaches. Action valence induced an overall sensitivity improvement in detecting subtle variations of congruent facial expressions (happiness after positive comfortable actions, anger after negative uncomfortable actions), in the absence of significant response bias shifts. Notably, both comfortable and uncomfortable reaches impacted sensitivity in an approximately symmetric way relative to a baseline inaction condition.
All of these constitute compelling evidence of a genuine top-down effect on perception: specifically, facial expressions of emotion are penetrable by action-induced mood. Affective priming by action valence is a candidate mechanism for the influence of observer’s internal states on properties experienced as phenomenally objective and yet loaded with meaning. PMID:26893964
Wolf, Richard C; Pujara, Maia; Baskaya, Mustafa K; Koenigs, Michael
2016-09-01
Facial emotion recognition is a critical aspect of human communication. Since abnormalities in facial emotion recognition are associated with social and affective impairment in a variety of psychiatric and neurological conditions, identifying the neural substrates and psychological processes underlying facial emotion recognition will help advance basic and translational research on social-affective function. Ventromedial prefrontal cortex (vmPFC) has recently been implicated in deploying visual attention to the eyes of emotional faces, although there is mixed evidence regarding the importance of this brain region for recognition accuracy. In the present study of neurological patients with vmPFC damage, we used an emotion recognition task with morphed facial expressions of varying intensities to determine (1) whether vmPFC is essential for emotion recognition accuracy, and (2) whether instructed attention to the eyes of faces would be sufficient to improve any accuracy deficits. We found that vmPFC lesion patients are impaired, relative to neurologically healthy adults, at recognizing moderate intensity expressions of anger and that recognition accuracy can be improved by providing instructions of where to fixate. These results suggest that vmPFC may be important for the recognition of facial emotion through a role in guiding visual attention to emotionally salient regions of faces. Copyright © 2016 Elsevier Ltd. All rights reserved.
Face-elicited ERPs and affective attitude: brain electric microstate and tomography analyses.
Pizzagalli, D; Lehmann, D; Koenig, T; Regard, M; Pascual-Marqui, R D
2000-03-01
Although behavioral studies have demonstrated that normative affective traits modulate the processing of facial and emotionally charged stimuli, direct electrophysiological evidence for this modulation is still lacking. Event-related potential (ERP) data associated with personal, traitlike approach- or withdrawal-related attitude (assessed post-recording and 14 months later) were investigated in 18 subjects during task-free (i.e. unrequested, spontaneous) emotional evaluation of faces. Temporal and spatial aspects of 27-channel ERPs were analyzed with microstate analysis and low resolution electromagnetic tomography (LORETA), a new method to compute 3-dimensional cortical current density implemented in the Talairach brain atlas. Microstate analysis showed group differences 132-196 and 196-272 ms poststimulus, with right-shifted electric gravity centers for subjects with negative affective attitude. During these (reliably identifiable over subjects) personality-modulated, face-elicited microstates, LORETA revealed activation of bilateral occipito-temporal regions, reportedly associated with facial configuration extraction processes. Negative compared to positive affective attitude showed higher activity in right temporal regions; positive compared to negative attitude showed higher activity in left temporo-parieto-occipital regions. These temporal and spatial aspects suggest that the subject groups differed in brain activity at early, automatic, stimulus-related face processing steps, when structural face encoding (configuration extraction) occurs. In sum, brain functional microstates associated with affect-related personality features modulate brain mechanisms already at early stages of face processing.
Winkielman, Piotr; Gogolushko, Yekaterina
2018-01-01
Affective stimuli can influence immediate reactions as well as spontaneous behaviors. Much evidence for such influence comes from studies of facial expressions. However, it is unclear whether these effects hold for other affective stimuli, and how the amount of stimulus processing changes the nature of the influence. This paper addresses these issues by comparing the influence on consumption behaviors of emotional pictures and valence-matched words presented at suboptimal and supraliminal durations. In Experiment 1, both suboptimal and supraliminal emotional facial expressions influenced consumption in an affect-congruent, assimilative way. In Experiment 2, pictures of both high- and low-frequency emotional objects congruently influenced consumption. In comparison, words tended to produce incongruent effects. We discuss these findings in light of privileged access theories, which hold that pictures better convey affective meaning than words, and embodiment theories, which hold that pictures better elicit somatosensory and motor responses. PMID:29434556
Emotion Perception from Face, Voice, and Touch: Comparisons and Convergence
Schirmer, Annett; Adolphs, Ralph
2017-01-01
Historically, research on emotion perception has focused on facial expressions, and findings from this modality have come to dominate our thinking about other modalities. Here, we examine emotion perception through a wider lens by comparing facial with vocal and tactile processing. We review stimulus characteristics and ensuing behavioral and brain responses, and show that audition and touch do not simply duplicate visual mechanisms. Each modality provides a distinct input channel and engages partly non-overlapping neuroanatomical systems with different processing specializations (e.g., specific emotions versus affect). Moreover, processing of signals across the different modalities converges, first into multi- and later into amodal representations that enable holistic emotion judgments. PMID:28173998
Cui, Han; Chen, Yi; Zhong, Weizheng; Yu, Haibo; Li, Zhifeng; He, Yuhai; Yu, Wenlong; Jin, Lei
2016-01-01
Bell's palsy is a peripheral nerve disease that causes abrupt onset of unilateral facial weakness. Pathologic studies have shown that ischemia of the facial nerve on the affected side of the face exists in Bell's palsy patients. Since the direction of facial nerve blood flow is primarily proximal to distal, facial skin microcirculation would also be affected after the onset of Bell's palsy. Therefore, monitoring the full area of facial skin microcirculation would help to identify the condition of Bell's palsy patients. In this study, a non-invasive, real-time and full-field imaging technology, laser speckle imaging (LSI), was applied to measure the facial skin blood perfusion distribution of Bell's palsy patients. 85 participants at different stages of Bell's palsy were included. Results showed that the patients' facial skin perfusion on the affected side was lower than that on the normal side in the eyelid region, and that the asymmetry of facial skin perfusion between the two eyelids was positively related to the stage of the disease (P < 0.001). During recovery, the perfusion of the affected eyelid increased to nearly the same level as the normal side. This study is a novel application of LSI to evaluating the facial skin perfusion of Bell's palsy patients, and we discovered that facial skin blood perfusion can reflect the stage of Bell's palsy, which suggests that microcirculation should be investigated in patients with this neurological deficit. LSI is also suggested as a potential diagnostic tool for Bell's palsy.
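Side-to-side perfusion asymmetry of the kind described in this record is commonly quantified with a normalized laterality index. The abstract does not state the exact formula used, so the sketch below is an illustrative assumption, with hypothetical perfusion values in arbitrary units.

```python
# A minimal sketch of a normalized perfusion asymmetry index.
# The formula and the values are illustrative assumptions, not the
# study's actual measure or data.

def asymmetry_index(normal_side: float, affected_side: float) -> float:
    """Normalized difference: 0 = symmetric, >0 = affected side lower."""
    return (normal_side - affected_side) / (normal_side + affected_side)

# Hypothetical eyelid-region perfusion values (arbitrary units)
print(round(asymmetry_index(220.0, 160.0), 3))  # 0.158
```

Under such an index, a value near zero would correspond to recovery (affected side approaching the normal side), consistent with the trend the abstract reports.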
Memory for faces: the effect of facial appearance and the context in which the face is encountered.
Mattarozzi, Katia; Todorov, Alexander; Codispoti, Maurizio
2015-03-01
We investigated the effects of appearance of emotionally neutral faces and the context in which the faces are encountered on incidental face memory. To approximate real-life situations as closely as possible, faces were embedded in a newspaper article, with a headline that specified an action performed by the person pictured. We found that facial appearance affected memory so that faces perceived as trustworthy or untrustworthy were remembered better than neutral ones. Furthermore, the memory of untrustworthy faces was slightly better than that of trustworthy faces. The emotional context of encoding affected the details of face memory. Faces encountered in a neutral context were more likely to be recognized as only familiar. In contrast, emotionally relevant contexts of encoding, whether pleasant or unpleasant, increased the likelihood of remembering semantic and even episodic details associated with faces. These findings suggest that facial appearance (i.e., perceived trustworthiness) affects face memory. Moreover, the findings support prior evidence that the engagement of emotion processing during memory encoding increases the likelihood that events are not only recognized but also remembered.
Zangara, Andrea; Blair, R J R; Curran, H Valerie
2002-08-01
Accumulating evidence from neuropsychological and neuroimaging research suggests that facial expressions are processed by at least partially separable neurocognitive systems. Recent evidence implies that the processing of different facial expressions may also be dissociable pharmacologically by GABAergic and noradrenergic compounds, although no study has directly compared the two types of drugs. The present study therefore directly compared the effects of a benzodiazepine with those of a beta-adrenergic blocker on the ability to recognise emotional expressions. A double-blind, independent group design was used with 45 volunteers to compare the effects of diazepam (15 mg) and metoprolol (50 mg) with matched placebo. Participants were presented with morphed facial expression stimuli and asked to identify which of the six basic emotions (sadness, happiness, anger, disgust, fear and surprise) were portrayed. Control measures of mood, pulse rate and word recall were also taken. Diazepam selectively impaired participants' ability to recognise expressions of both anger and fear but not other emotional expressions. Errors were mainly mistaking fear for surprise and disgust for anger. Metoprolol did not significantly affect facial expression recognition. These findings are interpreted as providing further support for the suggestion that there are dissociable systems responsible for processing emotional expressions. The results may have implications for understanding why 'paradoxical' aggression is sometimes elicited by benzodiazepines and for extending our psychological understanding of the anxiolytic effects of these drugs.
Misinterpretation of Facial Expressions of Emotion in Verbal Adults with Autism Spectrum Disorder
Eack, Shaun M.; MAZEFSKY, CARLA A.; Minshew, Nancy J.
2014-01-01
Facial emotion perception is significantly affected in autism spectrum disorder (ASD), yet little is known about how individuals with ASD misinterpret facial expressions that result in their difficulty in accurately recognizing emotion in faces. This study examined facial emotion perception in 45 verbal adults with ASD and 30 age- and gender-matched volunteers without ASD to identify patterns of emotion misinterpretation during face processing that contribute to emotion recognition impairments in autism. Results revealed that difficulty distinguishing emotional from neutral facial expressions characterized much of the emotion perception impairments exhibited by participants with ASD. In particular, adults with ASD uniquely misinterpreted happy faces as neutral, and were significantly more likely than typical volunteers to attribute negative valence to non-emotional faces. The over-attribution of emotions to neutral faces was significantly related to greater communication and emotional intelligence impairments in individuals with ASD. These findings suggest a potential negative bias toward the interpretation of facial expressions and may have implications for interventions designed to remediate emotion perception in ASD. PMID:24535689
More than mere mimicry? The influence of emotion on rapid facial reactions to faces.
Moody, Eric J; McIntosh, Daniel N; Mann, Laura J; Weisser, Kimberly R
2007-05-01
Within a second of seeing an emotional facial expression, people typically match that expression. These rapid facial reactions (RFRs), often termed mimicry, are implicated in emotional contagion, social perception, and embodied affect, yet ambiguity remains regarding the mechanism(s) involved. Two studies evaluated whether RFRs to faces are solely nonaffective motor responses or whether emotional processes are involved. Brow (corrugator, related to anger) and forehead (frontalis, related to fear) activity were recorded using facial electromyography (EMG) while undergraduates in two conditions (fear induction vs. neutral) viewed fear, anger, and neutral facial expressions. As predicted, fear induction increased fear expressions to angry faces within 1000 ms of exposure, demonstrating an emotional component of RFRs. This did not merely reflect increased fear from the induction, because responses to neutral faces were unaffected. Considering RFRs to be merely nonaffective automatic reactions is inaccurate. RFRs are not purely motor mimicry; emotion influences early facial responses to faces. The relevance of these data to emotional contagion, autism, and the mirror system-based perspectives on imitation is discussed.
Dynamic Facial Expressions Prime the Processing of Emotional Prosody.
Garrido-Vásquez, Patricia; Pell, Marc D; Paulmann, Silke; Kotz, Sonja A
2018-01-01
Evidence suggests that emotion is represented supramodally in the human brain. Emotional facial expressions, which often precede vocally expressed emotion in real life, can modulate event-related potentials (N100 and P200) during emotional prosody processing. To investigate these cross-modal emotional interactions, two lines of research have been put forward: cross-modal integration and cross-modal priming. In cross-modal integration studies, visual and auditory channels are temporally aligned, while in priming studies they are presented consecutively. Here we used cross-modal emotional priming to study the interaction of dynamic visual and auditory emotional information. Specifically, we presented dynamic facial expressions (angry, happy, neutral) as primes and emotionally-intoned pseudo-speech sentences (angry, happy) as targets. We were interested in how prime-target congruency would affect early auditory event-related potentials, i.e., N100 and P200, in order to shed more light on how dynamic facial information is used in cross-modal emotional prediction. Results showed enhanced N100 amplitudes for incongruently primed compared to congruently and neutrally primed emotional prosody, while the latter two conditions did not significantly differ. However, N100 peak latency was significantly delayed in the neutral condition compared to the other two conditions. Source reconstruction revealed that the right parahippocampal gyrus was activated in incongruent compared to congruent trials in the N100 time window. No significant ERP effects were observed in the P200 range. Our results indicate that dynamic facial expressions influence vocal emotion processing at an early point in time, and that an emotional mismatch between a facial expression and its ensuing vocal emotional signal induces additional processing costs in the brain, potentially because the cross-modal emotional prediction mechanism is violated in case of emotional prime-target incongruency.
Connectomic markers of disease expression, genetic risk and resilience in bipolar disorder
Dima, D; Roberts, R E; Frangou, S
2016-01-01
Bipolar disorder (BD) is characterized by emotional dysregulation and cognitive deficits associated with abnormal connectivity between subcortical—primarily emotional processing regions—and prefrontal regulatory areas. Given the significant contribution of genetic factors to BD, studies in unaffected first-degree relatives can identify neural mechanisms of genetic risk but also resilience, thus paving the way for preventive interventions. Dynamic causal modeling (DCM) and random-effects Bayesian model selection were used to define and assess connectomic phenotypes linked to facial affect processing and working memory in a demographically matched sample of first-degree relatives carefully selected for resilience (n=25), euthymic patients with BD (n=41) and unrelated healthy controls (n=46). During facial affect processing, patients and relatives showed similarly increased frontolimbic connectivity; resilient relatives, however, evidenced additional adaptive hyperconnectivity within the ventral visual stream. During working memory processing, patients displayed widespread hypoconnectivity within the corresponding network. In contrast, working memory network connectivity in resilient relatives was comparable to that of controls. Our results indicate that frontolimbic dysfunction during affect processing could represent a marker of genetic risk to BD, and diffuse hypoconnectivity within the working memory network a marker of disease expression. The association of hyperconnectivity within the affect-processing network with resilience to BD suggests adaptive plasticity that allows for compensatory changes and encourages further investigation of this phenotype in genetic and early intervention studies. PMID:26731443
Lee, Anthony J; Mitchem, Dorian G; Wright, Margaret J; Martin, Nicholas G; Keller, Matthew C; Zietsch, Brendan P
2016-01-01
Popular theory suggests that facial averageness is preferred in a partner for genetic benefits to offspring. However, whether facial averageness is associated with genetic quality is yet to be established. Here, we computed an objective measure of facial averageness for a large sample (N = 1,823) of identical and nonidentical twins and their siblings to test two predictions from the theory that facial averageness reflects genetic quality. First, we use biometrical modelling to estimate the heritability of facial averageness, which is necessary if it reflects genetic quality. We also test for a genetic association between facial averageness and facial attractiveness. Second, we assess whether paternal age at conception (a proxy of mutation load) is associated with facial averageness and facial attractiveness. Our findings are mixed with respect to our hypotheses. While we found that facial averageness does have a genetic component, and a significant phenotypic correlation exists between facial averageness and attractiveness, we did not find a genetic correlation between facial averageness and attractiveness (therefore, we cannot say that the genes that affect facial averageness also affect facial attractiveness), and paternal age at conception was not negatively associated with facial averageness. These findings support some of the previously untested assumptions of the 'genetic benefits' account of facial averageness, but cast doubt on others.
Test-retest reliability of subliminal facial affective priming.
Dannlowski, Udo; Suslow, Thomas
2006-02-01
Since the seminal 1993 demonstrations of Murphy and Zajonc, researchers have replicated and extended findings concerning subliminal affective priming. So far, however, no data on the test-retest reliability of affective priming effects are available. A subliminal facial affective priming task was administered to 22 healthy individuals (15 women and 7 men) twice, about 7 wk. apart. Happy and sad facial expressions were used as affective primes, and neutral Chinese ideographs served as target masks, which had to be evaluated. Neutral facial primes and a no-face condition served as baselines. All participants reported not having seen any of the prime faces at either testing session. Priming scores for affective faces compared to the baselines were computed. Acceptable test-retest correlations (rs) of up to .74 were found for the affective priming scores. Although measured almost 2 mo. apart, subliminal affective priming seems to be a temporally stable effect.
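Test-retest reliability of the kind reported here is a Pearson correlation between the two sessions' priming scores. A minimal sketch, assuming hypothetical per-participant scores (the study's actual data are not reproduced):

```python
# Hedged sketch: test-retest reliability as a Pearson correlation
# between session-1 and session-2 priming scores. The scores below
# are hypothetical, for illustration only.

import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-participant priming scores at the two sessions
session1 = [0.42, 0.10, 0.55, 0.33, 0.61, 0.05, 0.48, 0.29]
session2 = [0.39, 0.18, 0.50, 0.30, 0.58, 0.12, 0.44, 0.27]

print(round(pearson_r(session1, session2), 2))
```

A reliability coefficient near the reported .74 indicates that participants' relative priming scores are fairly stable across sessions, even two months apart.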
Subliminal perception of others' physical pain and pleasure.
Chiesa, Patrizia Andrea; Liuzza, Marco Tullio; Acciarino, Adriano; Aglioti, Salvatore Maria
2015-08-01
Studies indicate that explicit and implicit processing of affectively charged stimuli may be reflected in specific behavioral markers and physiological signatures. This study investigated whether the pleasantness ratings of a neutral target were affected by subliminal perception of pleasant and painful facial expressions. Participants were presented with images depicting the faces of non-famous models being slapped (painful condition), caressed (pleasant condition) or touched (neutral condition) by the right hand of another individual. In particular, we combined the continuous flash suppression technique with the affect misattribution procedure (AMP) to explore subliminal empathic processing. Measures of pupil reactivity along with empathy traits were also collected. Results showed that participants rated the neutral target as less likeable when preceded by a painful facial expression and as more likeable when preceded by a pleasant one. Pupil dilation was associated both with the implicit attitudes (AMP score) and with empathic concern. Thus, the results provide behavioral and physiological evidence that state-related empathic reactivity can occur at an entirely subliminal level and that it is linked to autonomic responses and empathic traits.
A new look at emotion perception: Concepts speed and shape facial emotion recognition.
Nook, Erik C; Lindquist, Kristen A; Zaki, Jamil
2015-10-01
Decades ago, the "New Look" movement challenged how scientists thought about vision by suggesting that conceptual processes shape visual perceptions. Currently, affective scientists are likewise debating the role of concepts in emotion perception. Here, we utilized a repetition-priming paradigm in conjunction with signal detection and individual difference analyses to examine how providing emotion labels-which correspond to discrete emotion concepts-affects emotion recognition. In Study 1, pairing emotional faces with emotion labels (e.g., "sad") increased individuals' speed and sensitivity in recognizing emotions. Additionally, individuals with alexithymia-who have difficulty labeling their own emotions-struggled to recognize emotions based on visual cues alone, but not when emotion labels were provided. Study 2 replicated these findings and further demonstrated that emotion concepts can shape perceptions of facial expressions. Together, these results suggest that emotion perception involves conceptual processing. We discuss the implications of these findings for affective, social, and clinical psychology. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
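The signal detection analyses used in studies like this one typically summarize recognition performance as sensitivity (d′), the distance between z-transformed hit and false-alarm rates. A minimal sketch using only the standard library; the rates below are hypothetical:

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Sensitivity index: z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf  # inverse standard-normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical rates: emotion recognition with vs. without emotion labels
print(d_prime(0.90, 0.20))  # labeled condition
print(d_prime(0.75, 0.20))  # unlabeled condition
```

A larger d′ means better discrimination of the target emotion from lures, independent of response bias.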
Emotional priming with facial exposures in euthymic patients with bipolar disorder.
Kim, Taek Su; Lee, Su Young; Ha, Ra Yeon; Kim, Eosu; An, Suk Kyoon; Ha, Kyooseob; Cho, Hyun-Sang
2011-12-01
People with bipolar disorder have abnormal emotional processing. We investigated the automatic and controlled emotional processing via a priming paradigm with subliminal and supraliminal facial exposure. We compared 20 euthymic bipolar patients and 20 healthy subjects on their performance in subliminal and supraliminal tasks. Priming tasks consisted of three different primes according to facial emotions (happy, sad, and neutral) followed by a neutral face as a target stimulus. The prime stimuli were presented subliminally (17 msec) or supraliminally (1000 msec). In subliminal tasks, both patients and controls judged the neutral target face as significantly more unpleasant (negative judgment shift) when presented with negative emotion primes compared with positive primes. In supraliminal tasks, bipolar subjects showed significant negative judgment shift, whereas healthy subjects did not. There was a significant group × emotion interaction for the judgment rate in supraliminal tasks. Our finding of persistent affective priming even at conscious awareness may suggest that bipolar patients have impaired cognitive control on emotional processing rather than automatically spreading activation of emotion.
Impaired neural processing of dynamic faces in left-onset Parkinson's disease.
Garrido-Vásquez, Patricia; Pell, Marc D; Paulmann, Silke; Sehm, Bernhard; Kotz, Sonja A
2016-02-01
Parkinson's disease (PD) affects patients beyond the motor domain. According to previous evidence, one mechanism that may be impaired in the disease is face processing. However, few studies have investigated this process at the neural level in PD. Moreover, research using dynamic facial displays rather than static pictures is scarce, but highly warranted due to the higher ecological validity of dynamic stimuli. In the present study we aimed to investigate how PD patients process emotional and non-emotional dynamic face stimuli at the neural level using event-related potentials. Since the literature has revealed a predominantly right-lateralized network for dynamic face processing, we divided the group into patients with left (LPD) and right (RPD) motor symptom onset (right versus left cerebral hemisphere predominantly affected, respectively). Participants watched short video clips of happy, angry, and neutral expressions and engaged in a shallow gender decision task in order to avoid confounds of task difficulty in the data. In line with our expectations, the LPD group showed significant face processing deficits compared to controls. While there were no group differences in early, sensory-driven processing (fronto-central N1 and posterior P1), the vertex positive potential, which is considered the fronto-central counterpart of the face-specific posterior N170 component, had a reduced amplitude and delayed latency in the LPD group. This may indicate disturbances of structural face processing in LPD. Furthermore, the effect was independent of the emotional content of the videos. In contrast, static facial identity recognition performance in LPD was not significantly different from controls, and comprehensive testing of cognitive functions did not reveal any deficits in this group. 
We therefore conclude that PD, and more specifically the predominantly right-hemispheric involvement in left-onset PD, is associated with impaired processing of dynamic facial expressions, which could be one of the mechanisms behind the often-reported problems of PD patients in their social lives. Copyright © 2016 Elsevier Ltd. All rights reserved.
Seubert, Janina; Gregory, Kristen M.; Chamberland, Jessica; Dessirier, Jean-Marc; Lundström, Johan N.
2014-01-01
Scented cosmetic products are used across cultures as a way to favorably influence one's appearance. While crossmodal effects of odor valence on perceived attractiveness of facial features have been demonstrated experimentally, it is unknown whether they represent a phenomenon specific to affective processing. In this experiment, we presented odors in the context of a face battery with systematic feature manipulations during a speeded response task. Modulatory effects of linear increases of odor valence were investigated by juxtaposing subsequent memory-based ratings tasks – one predominantly affective (attractiveness) and a second, cognitive (age). The linear modulation pattern observed for attractiveness was consistent with additive effects of face and odor appraisal. Effects of odor valence on age perception were not linearly modulated and may be the result of cognitive interference. Affective and cognitive processing of faces thus appear to differ in their susceptibility to modulation by odors, likely as a result of privileged access of olfactory stimuli to affective brain networks. These results are critically discussed with respect to potential biases introduced by the preceding speeded response task. PMID:24874703
Enhanced embodied response following ambiguous emotional processing.
Beffara, Brice; Ouellet, Marc; Vermeulen, Nicolas; Basu, Anamitra; Morisseau, Tiffany; Mermillod, Martial
2012-08-01
It has generally been assumed that high-level cognitive and emotional processes are based on amodal conceptual information. In contrast, however, "embodied simulation" theory states that the perception of an emotional signal can trigger a simulation of the related state in the motor, somatosensory, and affective systems. To study the effect of social context on the mimicry effect predicted by the "embodied simulation" theory, we recorded the electromyographic (EMG) activity of participants when looking at emotional facial expressions. We observed an increase in embodied responses when the participants were exposed to a context involving social valence before seeing the emotional facial expressions. An examination of the dynamic EMG activity induced by two socially relevant emotional expressions (namely joy and anger) revealed enhanced EMG responses of the facial muscles associated with the related social prime (either positive or negative). These results are discussed within the general framework of embodiment theory.
Electrophysiology of Cranial Nerve Testing: Trigeminal and Facial Nerves.
Muzyka, Iryna M; Estephan, Bachir
2018-01-01
The clinical examination of the trigeminal and facial nerves provides significant diagnostic value, especially in the localization of lesions in disorders affecting the central and/or peripheral nervous system. The electrodiagnostic evaluation of these nerves and their pathways adds further accuracy and reliability to the diagnostic investigation and the localization process, especially when different testing methods are combined based on the clinical presentation and the electrophysiological findings. The diagnostic uniqueness of the trigeminal and facial nerves is their connectivity and their coparticipation in reflexes commonly used in clinical practice, namely the blink and corneal reflexes. The other reflexes used in the diagnostic process and lesion localization are very nerve specific and add more diagnostic yield to the workup of certain disorders of the nervous system. This article provides a review of commonly used electrodiagnostic studies and techniques in the evaluation and lesion localization of cranial nerves V and VII.
Somppi, Sanni; Törnqvist, Heini; Kujala, Miiamaaria V.; Hänninen, Laura; Krause, Christina M.; Vainio, Outi
2016-01-01
Appropriate response to companions’ emotional signals is important for all social creatures. The emotional expressions of humans and non-human animals have analogies in their form and function, suggesting shared evolutionary roots, but very little is known about how animals other than primates view and process facial expressions. In primates, threat-related facial expressions evoke exceptional viewing patterns compared with neutral or positive stimuli. Here, we explore if domestic dogs (Canis familiaris) have such an attentional bias toward threatening social stimuli and whether observed emotional expressions affect dogs’ gaze fixation distribution among the facial features (eyes, midface and mouth). We recorded the voluntary eye gaze of 31 domestic dogs during viewing of facial photographs of humans and dogs with three emotional expressions (threatening, pleasant and neutral). We found that dogs’ gaze fixations spread systematically among facial features. The distribution of fixations was altered by the seen expression, but eyes were the most probable targets of the first fixations and gathered longer looking durations than the mouth regardless of the viewed expression. The examination of the inner facial features as a whole revealed more pronounced scanning differences among expressions. This suggests that dogs do not base their perception of facial expressions on the viewing of single structures, but on the interpretation of the composition formed by eyes, midface and mouth. Dogs evaluated social threat rapidly, and this evaluation led to an attentional bias that was dependent on the depicted species: threatening conspecifics’ faces evoked heightened attention, whereas threatening human faces instead evoked an avoidance response. We propose that threatening signals carrying differential biological validity are processed via distinctive neurocognitive pathways. Both of these mechanisms may have an adaptive significance for domestic dogs.
The findings provide a novel perspective on understanding the processing of emotional expressions and sensitivity to social threat in non-primates. PMID:26761433
Mason, L; Peters, E; Williams, S C; Kumari, V
2017-01-17
Little is known about the psychobiological mechanisms of cognitive behavioural therapy for psychosis (CBTp) and which specific processes are key in predicting favourable long-term outcomes. Following theoretical models of psychosis, this proof-of-concept study investigated whether the long-term recovery path of CBTp completers can be predicted by the neural changes in threat-based social affective processing that occur during CBTp. We followed up 22 participants who had undergone a social affective processing task during functional magnetic resonance imaging along with self-report and clinician-administered symptom measures, before and after receiving CBTp. Monthly ratings of psychotic and affective symptoms were obtained retrospectively across 8 years since receiving CBTp, plus self-reported recovery at final follow-up. We investigated whether these long-term outcomes were predicted by CBTp-led changes in functional connections of the dorsal prefrontal cortex and amygdala during the processing of threatening and prosocial facial affect. Whereas long-term psychotic symptoms were predicted by changes in prefrontal connections during prosocial facial affective processing, long-term affective symptoms were predicted by threat-related amygdalo-inferior parietal lobule connectivity. Greater increases in dorsolateral prefrontal cortex connectivity with the amygdala following CBTp also predicted higher subjective ratings of recovery at long-term follow-up. These findings show that reorganisation occurring at the neural level following psychological therapy can predict the subsequent recovery path of people with psychosis across 8 years. This novel methodology shows promise for further studies with larger sample sizes, which are needed to better examine the sensitivity of psychobiological processes, in comparison to existing clinical measures, in predicting long-term outcomes.
Effects of spatial frequency and location of fearful faces on human amygdala activity.
Morawetz, Carmen; Baudewig, Juergen; Treue, Stefan; Dechent, Peter
2011-01-31
Facial emotion perception plays a fundamental role in interpersonal social interactions. Images of faces contain visual information at various spatial frequencies. The amygdala has previously been reported to be preferentially responsive to low-spatial frequency (LSF) rather than to high-spatial frequency (HSF) filtered images of faces presented at the center of the visual field. Furthermore, it has been proposed that the amygdala might be especially sensitive to affective stimuli in the periphery. In the present study we investigated the impact of spatial frequency and stimulus eccentricity on face processing in the human amygdala and fusiform gyrus using functional magnetic resonance imaging (fMRI). The spatial frequencies of pictures of fearful faces were filtered to produce images that retained only LSF or HSF information. Facial images were presented either in the left or right visual field at two different eccentricities. In contrast to previous findings, we found that the amygdala responds to LSF and HSF stimuli in a similar manner regardless of the location of the affective stimuli in the visual field. Furthermore, the fusiform gyrus did not show differential responses to spatial frequency filtered images of faces. Our findings argue against the view that LSF information plays a crucial role in the processing of facial expressions in the amygdala and of a higher sensitivity to affective stimuli in the periphery. Copyright © 2010 Elsevier B.V. All rights reserved.
Saito, Atsuko; Hamada, Hiroki; Kikusui, Takefumi; Mogi, Kazutaka; Nagasawa, Miho; Mitsui, Shohei; Higuchi, Takashi; Hasegawa, Toshikazu; Hiraki, Kazuo
2014-01-01
The neuropeptide oxytocin plays a central role in prosocial and parental behavior in non-human mammals as well as humans. It has been suggested that oxytocin may affect visual processing of infant faces and emotional reaction to infants. Healthy male volunteers (N = 13) were tested for their ability to detect infant or adult faces among adult or infant faces (facial visual search task). Urine samples were collected from all participants before the study to measure the concentration of oxytocin. Urinary oxytocin positively correlated with performance in the facial visual search task. However, task performance and its correlation with oxytocin concentration did not differ between infant faces and adult faces. Our data suggest that endogenous oxytocin is related to facial visual cognition, but does not promote infant-specific responses in unmarried men who are not fathers.
Interference among the Processing of Facial Emotion, Face Race, and Face Gender.
Li, Yongna; Tse, Chi-Shing
2016-01-01
People can process multiple dimensions of facial properties simultaneously. Facial processing models are based on the processing of facial properties. The current study examined the processing of facial emotion, face race, and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interacted with face race in all the tasks. The interaction of face race and face gender was found in the race and gender categorization tasks, whereas the interaction of facial emotion and face gender was significant in the emotion and gender categorization tasks. These results provided evidence for a symmetric interaction between variant facial properties (emotion) and invariant facial properties (race and gender). PMID:27840621
How does context affect assessments of facial emotion? The role of culture and age
Ko, Seon-Gyu; Lee, Tae-Ho; Yoon, Hyea-Young; Kwon, Jung-Hye; Mather, Mara
2010-01-01
People from Asian cultures are more influenced by context in their visual processing than people from Western cultures. In this study, we examined how these cultural differences in context processing affect how people interpret facial emotions. We found that younger Koreans were more influenced than younger Americans by emotional background pictures when rating the emotion of a central face, especially those younger Koreans with low self-rated stress. In contrast, among older adults, neither Koreans nor Americans showed significant influences of context in their face emotion ratings. These findings suggest that cultural differences in reliance on context to interpret others' emotions depend on perceptual integration processes that decline with age, leading to fewer cultural differences in perception among older adults than among younger adults. Furthermore, when asked to recall the background pictures, younger participants recalled more negative pictures than positive pictures, whereas older participants recalled similar numbers of positive and negative pictures. These age differences in the valence of memory were consistent across culture. PMID:21038967
Attractiveness bias: A cognitive explanation.
Schein, Stevie S; Trujillo, Logan T; Langlois, Judith H
2017-01-01
According to cognitive averaging theory, preferences for attractive faces result from their similarity to facial prototypes, the categorical central tendencies of a population of faces. Prototypical faces are processed more fluently, resulting in increased positive affect in the viewer.
Accuracy of computer-assisted navigation: significant augmentation by facial recognition software.
Glicksman, Jordan T; Reger, Christine; Parasher, Arjun K; Kennedy, David W
2017-09-01
Over the past 20 years, image guidance navigation has been used with increasing frequency as an adjunct during sinus and skull base surgery. These devices commonly utilize surface registration, where varying pressure of the registration probe and loss of contact with the face during the skin tracing process can lead to registration inaccuracies, and the number of registration points incorporated is necessarily limited. The aim of this study was to evaluate the use of novel facial recognition software for image guidance registration. Consecutive adults undergoing endoscopic sinus surgery (ESS) were prospectively studied. Patients underwent image guidance registration via both conventional surface registration and facial recognition software. The accuracy of both registration processes was measured at the head of the middle turbinate (MTH), middle turbinate axilla (MTA), anterior wall of the sphenoid sinus (SS), and nasal tip (NT). Forty-five patients were included in this investigation. Facial recognition was accurate to within a mean of 0.47 mm at the MTH, 0.33 mm at the MTA, 0.39 mm at the SS, and 0.36 mm at the NT. Facial recognition was more accurate than surface registration at the MTH by an average of 0.43 mm (p = 0.002), at the MTA by an average of 0.44 mm (p < 0.001), and at the SS by an average of 0.40 mm (p < 0.001). The integration of facial recognition software did not adversely affect registration time. In this prospective study, automated facial recognition software significantly improved the accuracy of image guidance registration when compared to conventional surface registration. © 2017 ARS-AAOA, LLC.
The influence of context on distinct facial expressions of disgust.
Reschke, Peter J; Walle, Eric A; Knothe, Jennifer M; Lopez, Lukas D
2018-06-11
Face perception is susceptible to contextual influence and perceived physical similarities between emotion cues. However, studies often use structurally homogeneous facial expressions, making it difficult to explore how within-emotion variability in facial configuration affects emotion perception. This study examined the influence of context on the emotional perception of categorically identical, yet physically distinct, facial expressions of disgust. Participants categorized two perceptually distinct disgust facial expressions, "closed" (i.e., scrunched nose, closed mouth) and "open" (i.e., scrunched nose, open mouth, protruding tongue), that were embedded in contexts comprising emotion postures and scenes. Results demonstrated that the effect of nonfacial elements was significantly stronger for "open" disgust facial expressions than "closed" disgust facial expressions. These findings provide support that physical similarity within discrete categories of facial expressions is mutable and plays an important role in affective face perception. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Selective attention modulates early human evoked potentials during emotional face-voice processing.
Ho, Hao Tam; Schröger, Erich; Kotz, Sonja A
2015-04-01
Recent findings on multisensory integration suggest that selective attention influences cross-sensory interactions from an early processing stage. Yet, in the field of emotional face-voice integration, the hypothesis prevails that facial and vocal emotional information interacts preattentively. Using ERPs, we investigated the influence of selective attention on the perception of congruent versus incongruent combinations of neutral and angry facial and vocal expressions. Attention was manipulated via four tasks that directed participants to (i) the facial expression, (ii) the vocal expression, (iii) the emotional congruence between the face and the voice, and (iv) the synchrony between lip movement and speech onset. Our results revealed early interactions between facial and vocal emotional expressions, manifested as modulations of the auditory N1 and P2 amplitude by incongruent emotional face-voice combinations. Although audiovisual emotional interactions within the N1 time window were affected by the attentional manipulations, interactions within the P2 modulation showed no such attentional influence. Thus, we propose that the N1 and P2 are functionally dissociated in terms of emotional face-voice processing and discuss evidence in support of the notion that the N1 is associated with cross-sensory prediction, whereas the P2 relates to the derivation of an emotional percept. Essentially, our findings put the integration of facial and vocal emotional expressions into a new perspective-one that regards the integration process as a composite of multiple, possibly independent subprocesses, some of which are susceptible to attentional modulation, whereas others may be influenced by additional factors.
Facial trauma among victims of terrestrial transport accidents.
d'Avila, Sérgio; Barbosa, Kevan Guilherme Nóbrega; Bernardino, Ítalo de Macedo; da Nóbrega, Lorena Marques; Bento, Patrícia Meira; E Ferreira, Efigênia Ferreira
2016-01-01
In developing countries, terrestrial transport accidents - TTA, especially those involving automobiles and motorcycles - are a major cause of facial trauma, surpassing urban violence. This cross-sectional census study attempted to determine facial trauma occurrence with terrestrial transport accidents etiology, involving cars, motorcycles, or accidents with pedestrians in the northeastern region of Brazil, and examine victims' socio-demographic characteristics. Morbidity data from forensic service reports of victims who sought care from January to December 2012 were analyzed. Altogether, 2379 reports were evaluated, of which 673 were related to terrestrial transport accidents and 103 involved facial trauma. Three previously trained and calibrated researchers collected data using a specific form. Facial trauma occurrence rate was 15.3% (n=103). The most affected age group was 20-29 years (48.3%), and more men than women were affected (2.81:1). Motorcycles were involved in the majority of accidents resulting in facial trauma (66.3%). The occurrence of facial trauma in terrestrial transport accident victims tends to affect a greater proportion of young and male subjects, and the most prevalent accidents involve motorcycles. Copyright © 2015 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.
Woolley, J D; Chuang, B; Fussell, C; Scherer, S; Biagianti, B; Fulford, D; Mathalon, D H; Vinogradov, S
2017-05-01
Blunted facial affect is a common negative symptom of schizophrenia. Additionally, assessing the trustworthiness of faces is a social cognitive ability that is impaired in schizophrenia. Currently available pharmacological agents are ineffective at improving either of these symptoms, despite their clinical significance. The hypothalamic neuropeptide oxytocin has multiple prosocial effects when administered intranasally to healthy individuals and shows promise in decreasing negative symptoms and enhancing social cognition in schizophrenia. Although two small studies have investigated oxytocin's effects on ratings of facial trustworthiness in schizophrenia, its effects on facial expressivity have not been investigated in any population. We investigated the effects of oxytocin on facial emotional expressivity while participants performed a facial trustworthiness rating task in 33 individuals with schizophrenia and 35 age-matched healthy controls using a double-blind, placebo-controlled, cross-over design. Participants rated the trustworthiness of presented faces interspersed with emotionally evocative photographs while being video-recorded. Participants' facial expressivity in these videos was quantified by blind raters using a well-validated manualized approach (i.e. the Facial Expression Coding System; FACES). While oxytocin administration did not affect ratings of facial trustworthiness, it significantly increased facial expressivity in individuals with schizophrenia (Z = -2.33, p = 0.02) and at trend level in healthy controls (Z = -1.87, p = 0.06). These results demonstrate that oxytocin administration can increase facial expressivity in response to emotional stimuli and suggest that oxytocin may have the potential to serve as a treatment for blunted facial affect in schizophrenia.
Novel Noninvasive Brain Disease Detection System Using a Facial Image Sensor
Shu, Ting; Zhang, Bob; Tang, Yuan Yan
2017-01-01
Brain disease, including any condition or disability that affects the brain, is fast becoming a leading cause of death. The traditional diagnostic methods of brain disease are time-consuming, inconvenient and non-patient friendly. As more and more individuals undergo examinations to determine if they suffer from any form of brain disease, developing noninvasive, efficient, and patient-friendly detection systems will be beneficial. Therefore, in this paper, we propose a novel noninvasive brain disease detection system based on the analysis of facial colors. The system consists of four components. A facial image is first captured through a specialized sensor, where four facial key blocks are next located automatically from the various facial regions. Color features are extracted from each block to form a feature vector for classification via the Probabilistic Collaborative based Classifier. To thoroughly test the system and its performance, seven facial key block combinations were experimented with. The best result was achieved using the second facial key block, where it showed that the Probabilistic Collaborative based Classifier is the most suitable. The overall performance of the proposed system achieves an accuracy of 95%, a sensitivity of 94.33%, a specificity of 95.67%, and an average processing time (for one sample) of <1 min for brain disease detection. PMID:29292716
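The performance figures above follow the standard confusion-matrix definitions. A minimal sketch with hypothetical counts chosen so that they reproduce rates of roughly the same magnitude as those reported:

```python
def confusion_metrics(tp: int, fn: int, tn: int, fp: int) -> dict:
    """Accuracy, sensitivity (true-positive rate), specificity (true-negative rate)."""
    total = tp + fn + tn + fp
    return {
        "accuracy": (tp + tn) / total,
        "sensitivity": tp / (tp + fn),   # correctly flagged disease cases
        "specificity": tn / (tn + fp),   # correctly cleared healthy cases
    }

# Hypothetical counts for a 600-sample brain-disease test set
m = confusion_metrics(tp=283, fn=17, tn=287, fp=13)
print(m)
```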
Li, Yuan Hang; Tottenham, Nim
2013-04-01
A growing literature suggests that the self-face is involved in processing the facial expressions of others. The authors experimentally activated self-face representations to assess its effects on the recognition of dynamically emerging facial expressions of others. They exposed participants to videos of either their own faces (self-face prime) or faces of others (nonself-face prime) prior to a facial expression judgment task. Their results show that experimentally activating self-face representations results in earlier recognition of dynamically emerging facial expression. As a group, participants in the self-face prime condition recognized expressions earlier (when less affective perceptual information was available) compared to participants in the nonself-face prime condition. There were individual differences in performance, such that poorer expression identification was associated with higher autism traits (in this neurocognitively healthy sample). However, when randomized into the self-face prime condition, participants with high autism traits performed as well as those with low autism traits. Taken together, these data suggest that the ability to recognize facial expressions in others is linked with the internal representations of our own faces. PsycINFO Database Record (c) 2013 APA, all rights reserved.
Resting RSA Is Associated with Natural and Self-Regulated Responses to Negative Emotional Stimuli
ERIC Educational Resources Information Center
Demaree, Heath A.; Robinson, Jennifer L.; Everhart, D. Erik; Schmeichel, Brandon J.
2004-01-01
Resting respiratory sinus arrhythmia (RSA) was assessed among 111 adult participants. These individuals were then asked to watch a positive or negative affective film in either a natural manner or while exaggerating their facial response. Facial reactions to the film were video-recorded and subsequently rated in terms of facial affect.…
The Relation of Facial Affect Recognition and Empathy to Delinquency in Youth Offenders
ERIC Educational Resources Information Center
Carr, Mary B.; Lutjemeier, John A.
2005-01-01
Associations among facial affect recognition, empathy, and self-reported delinquency were studied in a sample of 29 male youth offenders at a probation placement facility. Youth offenders were asked to recognize facial expressions of emotions from adult faces, child faces, and cartoon faces. Youth offenders also responded to a series of statements…
ERIC Educational Resources Information Center
Bekele, Esubalew; Crittendon, Julie; Zheng, Zhi; Swanson, Amy; Weitlauf, Amy; Warren, Zachary; Sarkar, Nilanjan
2014-01-01
Teenagers with autism spectrum disorder (ASD) and age-matched controls participated in a dynamic facial affect recognition task within a virtual reality (VR) environment. Participants identified the emotion of a facial expression displayed at varied levels of intensity by a computer generated avatar. The system assessed performance (i.e.,…
Impaired holistic processing of unfamiliar individual faces in acquired prosopagnosia.
Ramon, Meike; Busigny, Thomas; Rossion, Bruno
2010-03-01
Prosopagnosia is an impairment at individualizing faces that classically follows brain damage. Several studies have reported observations supporting an impairment of holistic/configural face processing in acquired prosopagnosia. However, this issue may require more compelling evidence as the cases reported were generally patients suffering from integrative visual agnosia, and the sensitivity of the paradigms used to measure holistic/configural face processing in normal individuals remains unclear. Here we tested a well-characterized case of acquired prosopagnosia (PS) with no object recognition impairment, in five behavioral experiments (whole/part and composite face paradigms with unfamiliar faces). In all experiments, for normal observers we found that processing of a given facial feature was affected by the location and identity of the other features in a whole face configuration. In contrast, the patient's results over these experiments indicate that she encodes local facial information independently of the other features embedded in the whole facial context. These observations and a survey of the literature indicate that abnormal holistic processing of the individual face may be a characteristic hallmark of prosopagnosia following brain damage, perhaps with various degrees of severity. Copyright (c) 2009 Elsevier Ltd. All rights reserved.
Humor drawings evoked temporal and spectral EEG processes
Kuo, Hsien-Chu; Chuang, Shang-Wen
2017-01-01
The study aimed to explore humor processing elicited through the manipulation of artistic drawings. Using the Comprehension–Elaboration Theory of humor as the main research background, the experiment manipulated the head portraits of celebrities based on the independent variables of facial deformation (large/small) and addition of affective features (positive/negative). A 64-channel electroencephalography was recorded in 30 participants while they viewed the incongruous drawings of celebrities. The electroencephalography temporal and spectral responses were measured during the three stages of humor: incongruity detection, incongruity comprehension, and elaboration. Analysis of event-related potentials indicated that for humorous vs non-humorous drawings, facial deformation and the addition of affective features significantly affected the degree of humor elicited, specifically: large > small deformation; negative > positive affective features. The N170, N270, N400, N600-800 and N900-1200 components showed significant differences, particularly in the right prefrontal and frontal regions. Analysis of event-related spectral perturbation showed significant differences in the theta band evoked in the anterior cingulate cortex, parietal region and posterior cingulate cortex; and in the alpha and beta bands in the motor areas. These regions are involved in emotional processing, memory retrieval, and laughter and feelings of amusement induced by elaboration of the situation. PMID:28402573
Mothers' pupillary responses to infant facial expressions.
Yrttiaho, Santeri; Niehaus, Dana; Thomas, Eileen; Leppänen, Jukka M
2017-02-06
Human parental care relies heavily on the ability to monitor and respond to a child's affective states. The current study examined pupil diameter as a potential physiological index of mothers' affective response to infant facial expressions. Pupillary time-series were measured from 86 mothers of young infants in response to an array of photographic infant faces falling into four emotive categories based on valence (positive vs. negative) and arousal (mild vs. strong). Pupil dilation was highly sensitive to the valence of facial expressions, being larger for negative vs. positive facial expressions. A separate control experiment with luminance-matched non-face stimuli indicated that the valence effect was specific to facial expressions and cannot be explained by luminance confounds. Pupil response was not sensitive to the arousal level of facial expressions. The results show the feasibility of using pupil diameter as a marker of mothers' affective responses to ecologically valid infant stimuli and point to a particularly prompt maternal response to infant distress cues.
Mokhtari, Setareh; Buttle, Heather
2015-01-01
We examined the effect of induced mood, varying in valence and longevity, on local processing of emotional faces. It was found that negative facial expression conveyed by the global level of the face interferes with efficient processing of the local features. The results also showed that the duration of involvement with a mood influenced the local processing. We observed that attending to the local level of faces is not different in short-lived happy and sad mood states. However, as the mood state is experienced for a longer period, local processing was impaired in happy mood compared to sad mood. Taken together, we concluded that both facial expressions and affective states influence processing of the local parts of faces. Moreover, we suggest that mediating factors like the duration of involvement with the mood play a role in the interrelation between mood, attention, and perception. PMID:25883696
Perry, Anat; Aviezer, Hillel; Goldstein, Pavel; Palgi, Sharon; Klein, Ehud; Shamay-Tsoory, Simone G
2013-11-01
The neuropeptide oxytocin (OT) has been repeatedly reported to play an essential role in the regulation of social cognition in humans in general, and specifically in enhancing the recognition of emotions from facial expressions. The latter was assessed in different paradigms that rely primarily on isolated and decontextualized emotional faces. However, recent evidence has indicated that the perception of basic facial expressions is not context invariant and can be categorically altered by context, especially body context, at early perceptual levels. Body context has a strong effect on our perception of emotional expressions, especially when the actual target face and the contextually expected face are perceptually similar. To examine whether and how OT affects emotion recognition, we investigated the role of OT in categorizing facial expressions in incongruent body contexts. Our results show that in the combined process of deciphering emotions from facial expressions and from context, OT gives an advantage to the face. This advantage is most evident when the target face and the contextually expected face are perceptually similar. Copyright © 2013 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Miyahara, Motohide; Bray, Anne; Tsujii, Masatsugu; Fujita, Chikako; Sugiyama, Toshiro
2007-01-01
This study used a choice reaction-time paradigm to test the perceived impairment of facial affect recognition in Asperger's disorder. Twenty teenagers with Asperger's disorder and 20 controls were compared with respect to the latency and accuracy of response to happy or disgusted facial expressions, presented in cartoon or real images and in…
Discrimination of gender using facial image with expression change
NASA Astrophysics Data System (ADS)
Kuniyada, Jun; Fukuda, Takahiro; Terada, Kenji
2005-12-01
Through marketing research, the managers of large department stores and small convenience stores obtain information such as the ratio of male to female visitors and their age groups, and use it to improve their management plans. However, this work is carried out manually and becomes a heavy burden for small stores. In this paper, the authors propose a method for discriminating gender by extracting differences in facial expression change from color facial images. Many methods already exist in the field of image processing for automatic recognition of individuals from moving or still facial images. However, it is very difficult to discriminate gender under the influence of hairstyle, clothing, and so on. Therefore, we propose a method that is unaffected by individual characteristics such as the size and position of facial parts, by paying attention to changes in expression. The method requires two facial images: one with an expression and one expressionless. First, the facial surface region and the regions of facial parts such as the eyes, nose, and mouth are extracted from the facial image using hue and saturation in the HSV color system together with emphasized edge information. Next, features are extracted by calculating the rate of change of each facial part caused by the expression change. In the last step, these feature values are compared between the input data and a database, and the gender is discriminated. Experiments were conducted with laughing and smiling expressions, and good results were obtained for discriminating gender.
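The pipeline above can be sketched in outline. The skin-hue thresholds, the bounding-box inputs, and the nearest-neighbour matching below are illustrative assumptions, not the paper's exact parameters:

```python
import colorsys

def is_skin(r, g, b, h_range=(0.0, 0.1), s_range=(0.15, 0.6)):
    """Rough skin-pixel test using hue and saturation in the HSV
    color system, as in the region-extraction step (thresholds assumed)."""
    h, s, _ = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    return h_range[0] <= h <= h_range[1] and s_range[0] <= s <= s_range[1]

def part_change_rate(neutral_box, expressive_box):
    """Rate of change of a facial part's (width, height) between
    the expressionless image and the expressive image."""
    w0, h0 = neutral_box
    w1, h1 = expressive_box
    return ((w1 - w0) / w0, (h1 - h0) / h0)

def gender_by_nearest(features, database):
    """Match the change-rate feature vector against labelled
    reference entries (label, feature_vector) in a database."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    label, _ = min(database, key=lambda entry: sq_dist(entry[1], features))
    return label
```

For example, a mouth region that widens from 40 to 50 pixels while opening from 20 to 22 pixels yields the change-rate feature (0.25, 0.1), which is then matched against stored male and female reference vectors.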
Agency and facial emotion judgment in context.
Ito, Kenichi; Masuda, Takahiko; Li, Liman Man Wai
2013-06-01
Past research showed that East Asians' belief in holism was expressed as their tendencies to include background facial emotions into the evaluation of target faces more than North Americans. However, this pattern can be interpreted as North Americans' tendency to downplay background facial emotions due to their conceptualization of facial emotion as volitional expression of internal states. Examining this alternative explanation, we investigated whether different types of contextual information produce varying degrees of effect on one's face evaluation across cultures. In three studies, European Canadians and East Asians rated the intensity of target facial emotions surrounded with either affectively salient landscape sceneries or background facial emotions. The results showed that, although affectively salient landscapes influenced the judgment of both cultural groups, only European Canadians downplayed the background facial emotions. The role of agency as differently conceptualized across cultures and multilayered systems of cultural meanings are discussed.
Waller, Bridget M; Bard, Kim A; Vick, Sarah-Jane; Smith Pasqualini, Marcia C
2007-11-01
Human face perception is a finely tuned, specialized process. When comparing faces between species, therefore, it is essential to consider how people make these observational judgments. Comparing facial expressions may be particularly problematic, given that people tend to consider them categorically as emotional signals, which may affect how accurately specific details are processed. The bared-teeth display (BT), observed in most primates, has been proposed as a homologue of the human smile (J. A. R. A. M. van Hooff, 1972). In this study, judgments of similarity between BT displays of chimpanzees (Pan troglodytes) and human smiles varied in relation to perceived emotional valence. When a chimpanzee BT was interpreted as fearful, observers tended to underestimate the magnitude of the relationship between certain features (the extent of lip corner raise) and human smiles. These judgments may reflect the combined effects of categorical emotional perception, configural face processing, and perceptual organization in mental imagery and may demonstrate the advantages of using standardized observational methods in comparative facial expression research. Copyright 2007 APA.
Hyvärinen, Antti; Tarkka, Ina M; Mervaala, Esa; Pääkkönen, Ari; Valtonen, Hannu; Nuutinen, Juhani
2008-12-01
The purpose of this study was to assess clinical and neurophysiological changes after 6 mos of transcutaneous electrical stimulation in patients with unresolved facial nerve paralysis. A pilot case series of 10 consecutive patients with chronic facial nerve paralysis, of either idiopathic origin or due to herpes zoster oticus, participated in this open study. All patients received transcutaneous electrical stimulation below the sensory threshold for 6 mos for their facial nerve paralysis. The intervention consisted of gradually increasing the duration of electrical stimulation of three sites on the affected area for up to 6 hrs/day. Facial nerve function was assessed using the House-Brackmann clinical scale and neurophysiological measurements of compound motor action potential distal latencies on the affected and nonaffected sides. Patients were tested before and after the intervention. A significant improvement was observed in the facial nerve upper branch compound motor action potential distal latency on the affected side in all patients. An improvement of one grade on the House-Brackmann scale was observed, and some patients also reported subjective improvement. Transcutaneous electrical stimulation treatment may have a positive effect on unresolved facial nerve paralysis. This study illustrates a possibly effective treatment option for patients with chronic facial paresis and no other expectation of recovery.
A View of the Therapy for Bell's Palsy Based on Molecular Biological Analyses of Facial Muscles.
Moriyama, Hiroshi; Mitsukawa, Nobuyuki; Itoh, Masahiro; Otsuka, Naruhito
2017-12-01
Details regarding the molecular biological features of Bell's palsy have not been widely reported in textbooks. We genetically analyzed facial muscles and clarified these points. We performed genetic analysis of facial muscle specimens from Japanese patients with severe (House-Brackmann facial nerve grading system V) and moderate (House-Brackmann facial nerve grading system III) dysfunction due to Bell's palsy. Microarray analysis of gene expression was performed using specimens from the healthy and affected sides, and gene expression was compared. Changes in gene expression were defined as an affected side/healthy side ratio of >1.5 or <0.5. We observed that the gene expression in Bell's palsy changes with the degree of facial nerve palsy. Especially, muscle, neuron, and energy category genes tended to fluctuate with the degree of facial nerve palsy. It is expected that this study will aid in the development of new treatments and diagnostic/prognostic markers based on the severity of facial nerve palsy.
Multiple faces of pain: effects of chronic pain on the brain regulation of facial expression
Vachon-Presseau, Etienne; Roy, Mathieu; Woo, Choong-Wan; Kunz, Miriam; Martel, Marc-Olivier; Sullivan, Michael J.; Jackson, Philip L.; Wager, Tor D.; Rainville, Pierre
2018-01-01
Pain behaviors are shaped by social demands and learning processes, and chronic pain has been previously suggested to affect their meaning. In this study, we combined functional magnetic resonance imaging with in-scanner video recording during thermal pain stimulation and used multilevel mediation analyses to study the brain mediators of pain facial expressions and the perception of pain intensity (self-reports) in healthy individuals and patients with chronic back pain (CBP). Behavioral data showed that the relation between pain expression and pain report was disrupted in CBP. In both patients with CBP and healthy controls, brain activity varying on a trial-by-trial basis with pain facial expressions was mainly located in the primary motor cortex and completely dissociated from the pattern of brain activity varying with pain intensity ratings. Stronger activity was observed in CBP specifically during pain facial expressions in several nonmotor brain regions such as the medial prefrontal cortex, the precuneus, and the medial temporal lobe. In sharp contrast, no moderating effect of chronic pain was observed on brain activity associated with pain intensity ratings. Our results demonstrate that pain facial expressions and pain intensity ratings reflect different aspects of pain processing and support psychosocial models of pain suggesting that distinctive mechanisms are involved in the regulation of pain behaviors in chronic pain. PMID:27411160
The effect of facial makeup on the frequency of drivers stopping for hitchhikers.
Guéguen, Nicolas; Lamy, Lubomir
2013-08-01
Judgments of photographs have shown that makeup enhances ratings of women's facial attractiveness. The present study assessed whether makeup affects the stopping behavior of drivers in response to a hitchhiker's signal. Four 20- to 22-year-old female confederates wore facial makeup, or not, while pretending to be hitchhiking. Frequency of stopping was compared in 1,600 male and female drivers. Facial makeup was associated with an increase in the number of male drivers who stopped to offer a ride. Makeup did not affect frequency of stopping by female drivers.
Passing faces: sequence-dependent variations in the perceptual processing of emotional faces.
Karl, Christian; Hewig, Johannes; Osinsky, Roman
2016-10-01
There is broad evidence that contextual factors influence the processing of emotional facial expressions. Yet temporal-dynamic aspects, inter alia how face processing is influenced by the specific order of neutral and emotional facial expressions, have been largely neglected. To shed light on this topic, we recorded electroencephalogram from 168 healthy participants while they performed a gender-discrimination task with angry and neutral faces. Our event-related potential (ERP) analyses revealed a strong emotional modulation of the N170 component, indicating that the basic visual encoding and emotional analysis of a facial stimulus happen, at least partially, in parallel. While the N170 and the late positive potential (LPP; 400-600 ms) were only modestly affected by the sequence of preceding faces, we observed a strong influence of face sequences on the early posterior negativity (EPN; 200-300 ms). Finally, the differing response patterns of the EPN and LPP indicate that these two ERPs represent distinct processes during face analysis: while the former seems to represent the integration of contextual information in the perception of a current face, the latter appears to represent the net emotional interpretation of a current face.
Interaction of prime and target in the subliminal affective priming effect.
Haneda, Kaoruko; Nomura, Michio; Iidaka, Tetsuya; Ohira, Hideki
2003-04-01
It has been found that an emotional stimulus such as a facial expression presented subliminally can affect subsequent information processing and behavior, usually by shifting evaluation of a subsequent stimulus toward a valence congruent with the previous stimulus. This phenomenon is called subliminal affective priming. The present study was conducted to replicate and expand previous findings by investigating the interaction of primes and targets in the affective priming effect. Two conditions were used: a Prime condition (subliminal presentation, 35 msec) showing an angry face of a woman, and a No Prime control condition. Just after presentation of the prime, an ambiguous angry face or an emotionally neutral face was presented above the threshold of awareness (500 msec). Twelve female undergraduates judged the category of facial expression (Anger, Neutral, or Happiness) for the target faces. Analysis indicated that the Anger primes significantly facilitated judgment of anger for the ambiguous angry faces; however, this priming effect was not observed for neutral faces. Consequently, the present finding suggests that the subliminal affective priming effect is more prominent when the affective valence of primes and targets is congruent.
Ferrucci, Roberta; Giannicola, Gaia; Rosa, Manuela; Fumagalli, Manuela; Boggio, Paulo Sergio; Hallett, Mark; Zago, Stefano; Priori, Alberto
2012-01-01
Some evidence suggests that the cerebellum participates in the complex network processing emotional facial expression. To evaluate the role of the cerebellum in recognising facial expressions we delivered transcranial direct current stimulation (tDCS) over the cerebellum and prefrontal cortex. A facial emotion recognition task was administered to 21 healthy subjects before and after cerebellar tDCS; we also tested subjects with a visual attention task and a visual analogue scale (VAS) for mood. Anodal and cathodal cerebellar tDCS both significantly enhanced sensory processing in response to negative facial expressions (anodal tDCS, p=.0021; cathodal tDCS, p=.018), but left positive emotion and neutral facial expressions unchanged (p>.05). tDCS over the right prefrontal cortex left facial expressions of both negative and positive emotion unchanged. These findings suggest that the cerebellum is specifically involved in processing facial expressions of negative emotion.
ERIC Educational Resources Information Center
Vlamings, Petra H. J. M.; Jonkman, Lisa M.; Kemner, Chantal
2010-01-01
There is converging evidence for the presence of a fast subcortical face-processing route that operates on global face characteristics in the mature brain. Until now, little has been known about the development of such a route, which is surprising given suggestions that this fast subcortical face-processing route might be affected in…
Facial Features: What Women Perceive as Attractive and What Men Consider Attractive
Muñoz-Reyes, José Antonio; Iglesias-Julios, Marta; Pita, Miguel; Turiegano, Enrique
2015-01-01
Attractiveness plays an important role in social exchange and in the ability to attract potential mates, especially for women. Several facial traits have been described as reliable indicators of attractiveness in women, but very few studies consider the influence of several measurements simultaneously. In addition, most studies consider just one of two assessments to directly measure attractiveness: either self-evaluation or men's ratings. We explored the relationship between these two estimators of attractiveness and a set of facial traits in a sample of 266 young Spanish women. These traits are: facial fluctuating asymmetry, facial averageness, facial sexual dimorphism, and facial maturity. We made use of the advantage of having recently developed methodologies that enabled us to measure these variables in real faces. We also controlled for three other widely used variables: age, body mass index and waist-to-hip ratio. The inclusion of many different variables allowed us to detect any possible interaction between the features described that could affect attractiveness perception. Our results show that facial fluctuating asymmetry is related both to self-perceived and male-rated attractiveness. Other facial traits are related only to one direct attractiveness measurement: facial averageness and facial maturity only affect men's ratings. Unmodified faces are closer to natural stimuli than are manipulated photographs, and therefore our results support the importance of employing unmodified faces to analyse the factors affecting attractiveness. We also discuss the relatively low equivalence between self-perceived and male-rated attractiveness and how various anthropometric traits are relevant to them in different ways. Finally, we highlight the need to perform integrated-variable studies to fully understand female attractiveness. PMID:26161954
Discrimination and categorization of emotional facial expressions and faces in Parkinson's disease.
Alonso-Recio, Laura; Martín, Pilar; Rubio, Sandra; Serrano, Juan M
2014-09-01
Our objective was to compare the ability to discriminate and categorize emotional facial expressions (EFEs) and facial identity characteristics (age and/or gender) in a group of 53 individuals with Parkinson's disease (PD) and another group of 53 healthy subjects. On the one hand, by means of discrimination and identification tasks, we compared two stages in the visual recognition process that could be selectively affected in individuals with PD. On the other hand, facial expression versus gender and age comparison permits us to contrast whether the emotional or non-emotional content influences the configural perception of faces. In Experiment I, we did not find differences between groups, either with facial expression or age, in discrimination tasks. Conversely, in Experiment II, we found differences between the groups, but only in the EFE identification task. Taken together, our results indicate that configural perception of faces does not seem to be globally impaired in PD. However, this ability is selectively altered when the categorization of emotional faces is required. A deeper assessment of the PD group indicated that decline in facial expression categorization is more evident in a subgroup of patients with higher global impairment (motor and cognitive). Taken together, these results suggest that the problems found in facial expression recognition may be associated with the progressive neuronal loss in frontostriatal and mesolimbic circuits, which characterizes PD. © 2013 The British Psychological Society.
Curvilinear relationship between phonological working memory load and social-emotional modulation
Mano, Quintino R.; Brown, Gregory G.; Bolden, Khalima; Aupperle, Robin; Sullivan, Sarah; Paulus, Martin P.; Stein, Murray B.
2015-01-01
Accumulating evidence suggests that working memory load is an important factor for the interplay between cognitive and facial-affective processing. However, it is unclear how distraction caused by perception of faces interacts with load-related performance. We developed a modified version of the delayed match-to-sample task wherein task-irrelevant facial distracters were presented early in the rehearsal of pseudoword memoranda that varied incrementally in load size (1-syllable, 2-syllables, or 3-syllables). Facial distracters displayed happy, sad, or neutral expressions in Experiment 1 (N=60) and happy, fearful, or neutral expressions in Experiment 2 (N=29). Facial distracters significantly disrupted task performance in the intermediate load condition (2-syllable) but not in the low or high load conditions (1- and 3-syllables, respectively), an interaction replicated and generalised in Experiment 2. All facial distracters disrupted working memory in the intermediate load condition irrespective of valence, suggesting a primary and general effect of distraction caused by faces. However, sad and fearful faces tended to be less disruptive than happy faces, suggesting a secondary and specific valence effect. Working memory appears to be most vulnerable to social-emotional information at intermediate loads. At low loads, spare capacity is capable of accommodating the combinatorial load (1-syllable plus facial distracter), whereas high loads maximised capacity and deprived facial stimuli from occupying working memory slots to cause disruption. PMID:22928750
Mondloch, Catherine J
2012-02-01
The current research investigated the influence of body posture on adults' and children's perception of facial displays of emotion. In each of two experiments, participants categorized facial expressions that were presented on a body posture that was congruent (e.g., a sad face on a body posing sadness) or incongruent (e.g., a sad face on a body posing fear). Adults and 8-year-olds made more errors and had longer reaction times on incongruent trials than on congruent trials when judging sad versus fearful facial expressions, an effect that was larger in 8-year-olds. The congruency effect was reduced when faces and bodies were misaligned, providing some evidence for holistic processing. Neither adults nor 8-year-olds were affected by congruency when judging sad versus happy expressions. Evidence that congruency effects vary with age and with similarity of emotional expressions is consistent with dimensional theories and "emotional seed" models of emotion perception. 2011 Elsevier Inc. All rights reserved.
The Child Affective Facial Expression (CAFE) set: validity and reliability from untrained adults
LoBue, Vanessa; Thrasher, Cat
2014-01-01
Emotional development is one of the largest and most productive areas of psychological research. For decades, researchers have been fascinated by how humans respond to, detect, and interpret emotional facial expressions. Much of the research in this area has relied on controlled stimulus sets of adults posing various facial expressions. Here we introduce a new stimulus set of emotional facial expressions into the domain of research on emotional development—The Child Affective Facial Expression set (CAFE). The CAFE set features photographs of a racially and ethnically diverse group of 2- to 8-year-old children posing for six emotional facial expressions—angry, fearful, sad, happy, surprised, and disgusted—and a neutral face. In the current work, we describe the set and report validity and reliability data on the set from 100 untrained adult participants. PMID:25610415
Ravaja, Niklas
2004-01-01
We examined the moderating influence of dispositional behavioral inhibition system and behavioral activation system (BAS) sensitivities, Negative Affect, and Positive Affect on the relationship between a small moving vs. static facial image and autonomic responses when viewing/listening to news messages read by a newscaster among 36 young adults. Autonomic parameters measured were respiratory sinus arrhythmia (RSA), low-frequency (LF) component of heart rate variability (HRV), electrodermal activity, and pulse transit time (PTT). The results showed that dispositional BAS sensitivity, particularly BAS Fun Seeking, and Negative Affect interacted with facial image motion in predicting autonomic nervous system activity. A moving facial image was related to lower RSA and LF component of HRV and shorter PTTs as compared to a static facial image among high BAS individuals. Even a small talking facial image may contribute to sustained attentional engagement among high BAS individuals, given that the BAS directs attention toward the positive cue and a moving social stimulus may act as a positive incentive for high BAS individuals.
Amphetamine as a social drug: Effects of d-amphetamine on social processing and behavior
Wardle, Margaret C.; Garner, Matthew J.; Munafò, Marcus R.; de Wit, Harriet
2012-01-01
Rationale: Drug users often report using drugs to enhance social situations, and empirical studies support the idea that drugs increase both social behavior and the value of social interactions. One way drugs may affect social behavior is by altering social processing, for example by decreasing perceptions of negative emotion in others. Objectives: We examined effects of d-amphetamine on processing of emotional facial expressions, and on the social behavior of talking. We predicted amphetamine would enhance attention, identification and responsivity to positive expressions, and that this in turn would predict increased talkativeness. Methods: Over three sessions, 36 healthy normal adults received placebo, 10 mg, and 20 mg d-amphetamine under counterbalanced double-blind conditions. At each session we measured processing of happy, fearful, sad and angry expressions using an attentional visual probe task, a dynamic emotion identification task, and measures of facial muscle activity. We also measured talking. Results: Amphetamine decreased the threshold for identifying all emotions, increased negative facial responses to sad expressions, and increased talkativeness. Contrary to our hypotheses, amphetamine did not alter attention to, identification of or facial responses to positive emotions specifically. Interestingly, the drug decreased the threshold to identify all emotions, and this effect was uniquely related to increased talkativeness, even after controlling for overall sensitivity to amphetamine. Conclusions: The results suggest that amphetamine may encourage sociability by increasing sensitivity to subtle emotional expressions. These findings suggest novel social mechanisms that may contribute to the rewarding effects of amphetamine. PMID:22526538
How to Avoid Facial Nerve Injury in Mastoidectomy?
Ryu, Nam-Gyu
2016-01-01
Unexpected iatrogenic facial nerve paralysis not only causes facial disfigurement, but also imposes a devastating effect on the social, psychological, and economic aspects of an affected person's life at once. The aims of this study were to postulate where surgeons had mistakenly drilled, or where the nerve had been obscured by granulation tissue or fibrous bands, and to look for a surgical approach focused on the safety of the facial nerve in mastoid surgery. We identified 14 cases of iatrogenic facial nerve injury (IFNI) during mastoid surgery over 5 years in Korea. The medical records of all the patients were obtained, and the injured facial nerve segment was analyzed together with the surgical technique of mastoidectomy. Eleven patients underwent facial nerve exploration and three patients had conservative management. 43% (6 cases) of iatrogenic facial nerve injuries had occurred in the tympanic segment, 28.5% (4 cases) in the second genu combined with the tympanic segment, and 28.5% (4 cases) in the mastoid segment. Surgeons should try to identify the facial nerve using available landmarks and keep in mind the anomalies of the facial nerve. With the use of intraoperative facial nerve monitoring, IFNI could be avoided in more cases. Many authors have emphasized the importance of intraoperative facial nerve monitoring, even in primary otologic surgery. However, the importance of an anatomical understanding of intratemporal landmarks, combined with meticulous dissection, cannot be overemphasized in preventing IFNI. PMID:27626078
How does context affect assessments of facial emotion? The role of culture and age.
Ko, Seon-Gyu; Lee, Tae-Ho; Yoon, Hyea-Young; Kwon, Jung-Hye; Mather, Mara
2011-03-01
People from Asian cultures are more influenced by context in their visual processing than people from Western cultures. In this study, we examined how these cultural differences in context processing affect how people interpret facial emotions. We found that younger Koreans were more influenced than younger Americans by emotional background pictures when rating the emotion of a central face, especially those younger Koreans with low self-rated stress. In contrast, among older adults, neither Koreans nor Americans showed significant influences of context in their face emotion ratings. These findings suggest that cultural differences in reliance on context to interpret others' emotions depend on perceptual integration processes that decline with age, leading to fewer cultural differences in perception among older adults than among younger adults. Furthermore, when asked to recall the background pictures, younger participants recalled more negative pictures than positive pictures, whereas older participants recalled similar numbers of positive and negative pictures. These age differences in the valence of memory were consistent across culture. (c) 2011 APA, all rights reserved.
Schneider, Kristin G; Hempel, Roelie J; Lynch, Thomas R
2013-10-01
Successful interpersonal functioning often requires both the ability to mask inner feelings and the ability to accurately recognize others' expressions--but what if effortful control of emotional expressions impacts the ability to accurately read others? In this study, we examined the influence of self-controlled expressive suppression and mimicry on facial affect sensitivity--the speed with which one can accurately identify gradually intensifying facial expressions of emotion. Muscle activity of the brow (corrugator, related to anger), upper lip (levator, related to disgust), and cheek (zygomaticus, related to happiness) were recorded using facial electromyography while participants randomized to one of three conditions (Suppress, Mimic, and No-Instruction) viewed a series of six distinct emotional expressions (happiness, sadness, fear, anger, surprise, and disgust) as they morphed from neutral to full expression. As hypothesized, individuals instructed to suppress their own facial expressions showed impairment in facial affect sensitivity. Conversely, mimicry of emotion expressions appeared to facilitate facial affect sensitivity. Results suggest that it is difficult for a person to be able to simultaneously mask inner feelings and accurately "read" the facial expressions of others, at least when these expressions are at low intensity. The combined behavioral and physiological data suggest that the strategies an individual selects to control his or her own expression of emotion have important implications for interpersonal functioning.
Menzel, Claudia; Hayn-Leichsenring, Gregor U; Langner, Oliver; Wiese, Holger; Redies, Christoph
2015-01-01
We investigated whether low-level processed image properties that are shared by natural scenes and artworks - but not veridical face photographs - affect the perception of facial attractiveness and age. Specifically, we considered the slope of the radially averaged Fourier power spectrum in a log-log plot. This slope is a measure of the distribution of spatial frequency power in an image. Images of natural scenes and artworks possess - compared to face images - a relatively shallow slope (i.e., increased high spatial frequency power). Since aesthetic perception might be based on the efficient processing of images with natural scene statistics, we assumed that the perception of facial attractiveness might also be affected by these properties. We calculated Fourier slope and other beauty-associated measurements in face images and correlated them with ratings of attractiveness and age of the depicted persons (Study 1). We found that Fourier slope - in contrast to the other tested image properties - did not predict attractiveness ratings when we controlled for age. In Study 2A, we overlaid face images with random-phase patterns with different statistics. Patterns with a slope similar to those in natural scenes and artworks resulted in lower attractiveness and higher age ratings. In Studies 2B and 2C, we directly manipulated the Fourier slope of face images and found that images with shallower slopes were rated as more attractive. Additionally, attractiveness of unaltered faces was affected by the Fourier slope of a random-phase background (Study 3). Faces in front of backgrounds with statistics similar to natural scenes and faces were rated as more attractive. We conclude that facial attractiveness ratings are affected by specific image properties. An explanation might be the efficient coding hypothesis. PMID:25835539
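The slope measure described in this abstract can be made concrete. Below is a minimal sketch (not the authors' code; the function name and implementation details are illustrative) of how one might estimate the slope of the radially averaged Fourier power spectrum of a grayscale image in log-log coordinates, using NumPy:

```python
import numpy as np

def fourier_slope(image):
    """Estimate the log-log slope of the radially averaged Fourier
    power spectrum of a 2D grayscale image (illustrative sketch)."""
    # 2D power spectrum, zero frequency shifted to the center
    f = np.fft.fftshift(np.fft.fft2(image))
    power = np.abs(f) ** 2

    # Radial distance of each frequency bin from the spectrum center
    h, w = image.shape
    y, x = np.indices((h, w))
    r = np.hypot(y - h // 2, x - w // 2).astype(int)

    # Average power within each integer frequency radius
    radial_power = np.bincount(r.ravel(), weights=power.ravel())
    counts = np.bincount(r.ravel())
    radial_power = radial_power / np.maximum(counts, 1)

    # Fit a line to log(power) vs. log(frequency), skipping the DC term
    radii = np.arange(1, min(h, w) // 2)
    slope, _ = np.polyfit(np.log(radii), np.log(radial_power[radii]), 1)
    return slope
```

For images with natural scene statistics the power-spectrum slope is typically near -2 (amplitude falls roughly as 1/f); shallower (less negative) slopes correspond to relatively more high spatial frequency power, the property manipulated in Studies 2B and 2C.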
Misinterpretation of Facial Expressions of Emotion in Verbal Adults with Autism Spectrum Disorder
ERIC Educational Resources Information Center
Eack, Shaun M.; Mazefsky, Carla A.; Minshew, Nancy J.
2015-01-01
Facial emotion perception is significantly affected in autism spectrum disorder, yet little is known about how individuals with autism spectrum disorder misinterpret facial expressions that result in their difficulty in accurately recognizing emotion in faces. This study examined facial emotion perception in 45 verbal adults with autism spectrum…
Amblyopia Associated with Congenital Facial Nerve Paralysis.
Iwamura, Hitoshi; Kondo, Kenji; Sawamura, Hiromasa; Baba, Shintaro; Yasuhara, Kazuo; Yamasoba, Tatsuya
2016-01-01
The association between congenital facial paralysis and visual development has not been thoroughly studied. Of 27 pediatric cases of congenital facial paralysis, we identified 3 patients who developed amblyopia, a visual acuity decrease caused by abnormal visual development, as comorbidity. These 3 patients had facial paralysis in the periocular region and developed amblyopia on the paralyzed side. They started treatment by wearing an eye patch immediately after diagnosis and before the critical visual developmental period; all patients responded to the treatment. Our findings suggest that the incidence of amblyopia in the cases of congenital facial paralysis, particularly the paralysis in the periocular region, is higher than that in the general pediatric population. Interestingly, 2 of the 3 patients developed anisometropic amblyopia due to the hyperopia of the affected eye, implying that the periocular facial paralysis may have affected the refraction of the eye through yet unspecified mechanisms. Therefore, the physicians who manage facial paralysis should keep this pathology in mind, and when they see pediatric patients with congenital facial paralysis involving the periocular region, they should consult an ophthalmologist as soon as possible. © 2016 S. Karger AG, Basel.
The facial nerve: anatomy and associated disorders for oral health professionals.
Takezawa, Kojiro; Townsend, Grant; Ghabriel, Mounir
2018-04-01
The facial nerve, the seventh cranial nerve, is of great clinical significance to oral health professionals. Most published literature either addresses the central connections of the nerve or its peripheral distribution but few integrate both of these components and also highlight the main disorders affecting the nerve that have clinical implications in dentistry. The aim of the current study is to provide a comprehensive description of the facial nerve. Multiple aspects of the facial nerve are discussed and integrated, including its neuroanatomy, functional anatomy, gross anatomy, clinical problems that may involve the nerve, and the use of detailed anatomical knowledge in the diagnosis of the site of facial nerve lesion in clinical neurology. Examples are provided of disorders that can affect the facial nerve during its intra-cranial, intra-temporal and extra-cranial pathways, and key aspects of clinical management are discussed. The current study is complemented by original detailed dissections and sketches that highlight key anatomical features and emphasise the extent and nature of anatomical variations displayed by the facial nerve.
Roelofs, Renée L; Wingbermühle, Ellen; Freriks, Kim; Verhaak, Chris M; Kessels, Roy P C; Egger, Jos I M
2015-04-01
Noonan syndrome (NS) and Turner syndrome (TS) are associated with cognitive problems and difficulties in affective information processing. While both phenotypes include short stature, facial dysmorphisms, and a webbed neck, genetic etiology and neuropsychological phenotype differ significantly. The present study examines putative differences in affective information processing and social assertiveness between adult women with NS and TS. Twenty-six women with NS, 40 women with TS, and 40 female controls were matched on age and intelligence, and subsequently compared on (1) alexithymia, measured by the Bermond-Vorst Alexithymia Questionnaire, (2) emotion perception, evaluated by the Emotion Recognition Task, and (3) social assertiveness and social discomfort, assessed by the Scale for Interpersonal Behavior. Women with TS showed higher levels of alexithymia than women with NS and controls (P-values < 0.001), whereas women with NS had more trouble recognizing angry facial expressions in comparison with controls (P = 0.01). No significant group differences were found for the frequency of social assertiveness and the level of social discomfort. Women with NS and TS demonstrated different patterns of impairment in affective information processing, in terms of alexithymia and emotion perception. The present findings suggest neuropsychological phenotyping to be helpful for the diagnosis of specific cognitive-affective deficits in genetic syndromes, for the enhancement of genetic counseling, and for the development of personalized treatment plans. © 2015 Wiley Periodicals, Inc.
Kaulard, Kathrin; Cunningham, Douglas W.; Bülthoff, Heinrich H.; Wallraven, Christian
2012-01-01
The ability to communicate is one of the core aspects of human life. For this, we use not only verbal but also nonverbal signals of remarkable complexity. Among the latter, facial expressions belong to the most important information channels. Despite the large variety of facial expressions we use in daily life, research on facial expressions has so far mostly focused on the emotional aspect. Consequently, most databases of facial expressions available to the research community also include only emotional expressions, neglecting the largely unexplored aspect of conversational expressions. To fill this gap, we present the MPI facial expression database, which contains a large variety of natural emotional and conversational expressions. The database contains 55 different facial expressions performed by 19 German participants. Expressions were elicited with the help of a method-acting protocol, which guarantees both well-defined and natural facial expressions. The method-acting protocol was based on every-day scenarios, which are used to define the necessary context information for each expression. All facial expressions are available in three repetitions, in two intensities, as well as from three different camera angles. A detailed frame annotation is provided, from which a dynamic and a static version of the database have been created. In addition to describing the database in detail, we also present the results of an experiment with two conditions that serve to validate the context scenarios as well as the naturalness and recognizability of the video sequences. Our results provide clear evidence that conversational expressions can be recognized surprisingly well from visual information alone. The MPI facial expression database will enable researchers from different research fields (including the perceptual and cognitive sciences, but also affective computing, as well as computer vision) to investigate the processing of a wider range of natural facial expressions. PMID:22438875
Negative ion treatment increases positive emotional processing in seasonal affective disorder.
Harmer, C J; Charles, M; McTavish, S; Favaron, E; Cowen, P J
2012-08-01
Antidepressant drug treatments increase the processing of positive compared to negative affective information early in treatment. Such effects have been hypothesized to play a key role in the development of later therapeutic responses to treatment. However, it is unknown whether these effects are a common mechanism of action for different treatment modalities. High-density negative ion (HDNI) treatment is an environmental manipulation that has efficacy in randomized clinical trials in seasonal affective disorder (SAD). The current study investigated whether a single session of HDNI treatment could reverse negative affective biases seen in seasonal depression using a battery of emotional processing tasks in a double-blind, placebo-controlled randomized study. Under placebo conditions, participants with seasonal mood disturbance showed reduced recognition of happy facial expressions, increased recognition memory for negative personality characteristics and increased vigilance to masked presentation of negative words in a dot-probe task compared to matched healthy controls. Negative ion treatment increased the recognition of positive compared to negative facial expression and improved vigilance to unmasked stimuli across participants with seasonal depression and healthy controls. Negative ion treatment also improved recognition memory for positive information in the SAD group alone. These effects were seen in the absence of changes in subjective state or mood. These results are consistent with the hypothesis that early change in emotional processing may be an important mechanism for treatment action in depression and suggest that these effects are also apparent with negative ion treatment in seasonal depression.
Age-Related Changes in the Processing of Emotional Faces in a Dual-Task Paradigm.
Casares-Guillén, Carmen; García-Rodríguez, Beatriz; Delgado, Marisa; Ellgring, Heiner
2016-01-01
Background/Study Context: Age-related changes appear to affect the ability to identify emotional facial expressions in dual-task conditions (i.e., while simultaneously performing a second visual task). The level of interference generated by the secondary task depends on the phase of emotional processing affected by the interference and the nature of the secondary task. The aim of the present study was to investigate the effect of these variables on age-related changes in the processing of emotional faces. The identification of emotional facial expressions (EFEs) was assessed in a dual-task paradigm using the following variables: (a) the phase during which interference was applied (encoding vs. retrieval phase); and (b) the nature of the interfering stimulus (visuospatial vs. verbal). The sample population consisted of 24 healthy aged adults (mean age = 75.38) and 40 younger adults (mean age = 26.90). The accuracy of EFE identification was calculated for all experimental conditions. Consistent with our hypothesis, the performance of the older group was poorer than that of the younger group in all experimental conditions. Dual-task performance was poorer when the interference occurred during the encoding phase of emotional face processing and when both tasks were of the same nature (i.e., when the experimental condition was more demanding in terms of attention). These results provide empirical evidence of age-related deficits in the identification of emotional facial expressions, which may be partially explained by the impairment of cognitive resources specific to this task. These findings may account for the difficulties experienced by the elderly during social interactions that require the concomitant processing of emotional and environmental information.
Socolovsky, Mariano; Páez, Miguel Domínguez; Masi, Gilda Di; Molina, Gonzalo; Fernández, Eduardo
2012-01-01
Background: Idiopathic facial nerve palsy (Bell's palsy) is a very common condition that affects the active population. Despite its generally benign course, a minority of patients can remain with permanent and severe sequelae, including facial palsy or dyskinesia. Hypoglossal to facial nerve anastomosis is rarely used to reinnervate the mimic muscles in these patients. In this paper, we present a case where a direct partial hypoglossal to facial nerve transfer was used to reinnervate the upper and lower face. We also discuss the indications of this procedure. Case Description: A 53-year-old woman presenting a spontaneous complete (House and Brackmann grade 6) facial palsy on her left side showed no improvement after 13 months of conservative treatment. Electromyography (EMG) showed complete denervation of the mimic muscles. A direct partial hypoglossal to facial nerve anastomosis was performed, including dissection of the facial nerve at the fallopian canal. One year after the procedure, the patient showed House and Brackmann grade 3 function in her affected face. Conclusions: Partial hypoglossal–facial anastomosis with intratemporal drilling of the facial nerve is a viable technique in the rare cases in which severe Bell's palsy does not recover spontaneously. Only carefully selected patients can really benefit from this technique. PMID:22574255
Implicit attentional bias for facial emotion in dissociative seizures: Additional evidence.
Pick, Susannah; Mellers, John D C; Goldstein, Laura H
2018-03-01
This study sought to extend knowledge about the previously reported preconscious attentional bias (AB) for facial emotion in patients with dissociative seizures (DS) by exploring whether the finding could be replicated, while controlling for concurrent anxiety, depression, and potentially relevant cognitive impairments. Patients diagnosed with DS (n=38) were compared with healthy controls (n=43) on a pictorial emotional Stroop test, in which backwardly masked emotional faces (angry, happy, neutral) were processed implicitly. The group with DS displayed a significantly greater AB to facial emotion relative to controls; however, the bias was not specific to negative or positive emotions. The group effect could not be explained by performance on standardized cognitive tests or self-reported depression/anxiety. The study provides additional evidence of a disproportionate and automatic allocation of attention to facial affect in patients with DS, including both positive and negative facial expressions. Such a tendency could act as a predisposing factor for developing DS initially, or may contribute to triggering individuals' seizures on an ongoing basis. Psychological interventions such as Cognitive Behavioral Therapy (CBT) or AB modification might be suitable approaches to target this bias in clinical practice. Copyright © 2018 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Spangler, Sibylle M.; Schwarzer, Gudrun; Korell, Monika; Maier-Karius, Johanna
2010-01-01
Four experiments were conducted with 5- to 11-year-olds and adults to investigate whether facial identity, facial speech, emotional expression, and gaze direction are processed independently of or in interaction with one another. In a computer-based, speeded sorting task, participants sorted faces according to facial identity while disregarding…
Operant conditioning of facial displays of pain.
Kunz, Miriam; Rainville, Pierre; Lautenbacher, Stefan
2011-06-01
The operant model of chronic pain posits that nonverbal pain behavior, such as facial expressions, is sensitive to reinforcement, but experimental evidence supporting this assumption is sparse. The aim of the present study was to investigate in a healthy population a) whether facial pain behavior can indeed be operantly conditioned using a discriminative reinforcement schedule to increase and decrease facial pain behavior and b) to what extent these changes affect pain experience indexed by self-ratings. In the experimental group (n = 29), the participants were reinforced every time that they showed pain-indicative facial behavior (up-conditioning) or a neutral expression (down-conditioning) in response to painful heat stimulation. Once facial pain behavior was successfully up- or down-conditioned, respectively (which occurred in 72% of participants), facial pain displays and self-report ratings were assessed. In addition, a control group (n = 11) was used that was yoked to the reinforcement plans of the experimental group. During the conditioning phases, reinforcement led to significant changes in facial pain behavior in the majority of the experimental group (p < .001) but not in the yoked control group (p > .136). Fine-grained analyses of facial muscle movements revealed a similar picture. Furthermore, the decline in facial pain displays (as observed during down-conditioning) strongly predicted changes in pain ratings (R(2) = 0.329). These results suggest that a) facial pain displays are sensitive to reinforcement and b) that changes in facial pain displays can affect self-report ratings.
How facial attractiveness affects sustained attention.
Li, Jie; Oksama, Lauri; Hyönä, Jukka
2016-10-01
The present study investigated whether and how facial attractiveness affects sustained attention. We adopted a multiple-identity tracking paradigm, using attractive and unattractive faces as stimuli. Participants were required to track moving target faces amid distractor faces and report the final location of each target. In Experiment 1, the attractive and unattractive faces differed in both the low-level properties (i.e., luminance, contrast, and color saturation) and high-level properties (i.e., physical beauty and age). The results showed that the attractiveness of both the target and distractor faces affected the tracking performance: The attractive target faces were tracked better than the unattractive target faces; when the targets and distractors were both unattractive male faces, the tracking performance was poorer than when they were of different attractiveness. In Experiment 2, the low-level properties of the facial images were equalized. The results showed that the attractive target faces were still tracked better than unattractive targets while the effects related to distractor attractiveness ceased to exist. Taken together, the results indicate that during attentional tracking the high-level properties related to the attractiveness of the target faces can be automatically processed, and then they can facilitate the sustained attention on the attractive targets, either with or without the supplement of low-level properties. On the other hand, only low-level properties of the distractor faces can be processed. When the distractors share similar low-level properties with the targets, they can be grouped together, so that it would be more difficult to sustain attention on the individual targets. © 2016 Scandinavian Psychological Associations and John Wiley & Sons Ltd.
Compensating for age limits through emotional crossmodal integration
Chaby, Laurence; Boullay, Viviane Luherne-du; Chetouani, Mohamed; Plaza, Monique
2015-01-01
Social interactions in daily life necessitate the integration of social signals from different sensory modalities. In the aging literature, it is well established that the recognition of emotion in facial expressions declines with advancing age, and this also occurs with vocal expressions. By contrast, crossmodal integration processing in healthy aging individuals is less documented. Here, we investigated the age-related effects on emotion recognition when faces and voices were presented alone or simultaneously, allowing for crossmodal integration. In this study, 31 young adults (M = 25.8 years) and 31 older adults (M = 67.2 years) were instructed to identify several basic emotions (happiness, sadness, anger, fear, disgust) and a neutral expression, which were displayed as visual (facial expressions), auditory (non-verbal affective vocalizations) or crossmodal (simultaneous, congruent facial and vocal affective expressions) stimuli. The results showed that older adults performed slower and worse than younger adults at recognizing negative emotions from isolated faces and voices. In the crossmodal condition, although slower, older adults were as accurate as younger except for anger. Importantly, additional analyses using the “race model” demonstrate that older adults benefited to the same extent as younger adults from the combination of facial and vocal emotional stimuli. These results help explain some conflicting results in the literature and may clarify emotional abilities related to daily life that are partially spared among older adults. PMID:26074845
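The "race model" analysis cited in this abstract refers to Miller's race-model inequality, which bounds the reaction-time (RT) distribution for redundant audiovisual stimuli by probability summation of the two unimodal distributions: G_AV(t) <= min(1, F_A(t) + F_V(t)). A positive violation of this bound suggests genuine crossmodal integration rather than a mere race between modalities. A minimal sketch of the test follows (illustrative only, not the authors' analysis code; function names are assumptions):

```python
import numpy as np

def ecdf(rts, t):
    """Empirical cumulative distribution of reaction times, evaluated at times t."""
    rts = np.sort(np.asarray(rts))
    return np.searchsorted(rts, t, side="right") / len(rts)

def race_model_violation(rt_av, rt_a, rt_v,
                         quantiles=np.linspace(0.05, 0.95, 19)):
    """Maximum violation of Miller's race-model inequality,
    G_AV(t) <= min(1, F_A(t) + F_V(t)), evaluated at quantile points
    of the redundant (audiovisual) condition.  Positive values indicate
    responses faster than any race model allows."""
    t = np.quantile(rt_av, quantiles)
    bound = np.minimum(1.0, ecdf(rt_a, t) + ecdf(rt_v, t))
    return float(np.max(ecdf(rt_av, t) - bound))
```

Evaluating the inequality pointwise at RT quantiles of the redundant condition, as done here, is a common convention in crossmodal RT studies; showing that older adults violate the bound to the same extent as younger adults is what supports the claim of preserved crossmodal benefit.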
de la Rosa, Stephan; Fademrecht, Laura; Bülthoff, Heinrich H; Giese, Martin A; Curio, Cristóbal
2018-06-01
Motor-based theories of facial expression recognition propose that the visual perception of facial expression is aided by sensorimotor processes that are also used for the production of the same expression. Accordingly, sensorimotor and visual processes should provide congruent emotional information about a facial expression. Here, we report evidence that challenges this view. Specifically, the repeated execution of facial expressions has the opposite effect on the recognition of a subsequent facial expression to that of the repeated viewing of facial expressions. Moreover, the findings of the motor condition, but not of the visual condition, were correlated with a nonsensory condition in which participants imagined an emotional situation. These results are well accounted for by the idea that facial expression recognition is not always mediated by motor processes but can also proceed on the basis of visual information alone.
The effect of Ramadan fasting on spatial attention through emotional stimuli
Molavi, Maziyar; Yunus, Jasmy; Utama, Nugraha P
2016-01-01
Fasting can influence psychological and mental states. In the current study, the effect of periodic fasting on the processing of emotion conveyed through gazed facial expressions, a realistic multisource of social information, was investigated for the first time. A dynamic cue-target task was administered, with behavioral and event-related potential measurements, to 40 participants to reveal temporal and spatial brain activity before, during, and after the fasting period. Fasting had several significant effects. The amplitude of the N1 component decreased over the centroparietal scalp during fasting, and reaction times also decreased during the fasting period. Self-ratings of arousal as well as mood increased during the fasting period. There was a significant contralateral alteration of P1 over the occipital area for happy facial expression stimuli. A significant effect of gazed expression, and of its interaction with the emotional stimuli, was indicated by the amplitude of N1. Furthermore, the findings confirmed the validity effect, a congruency between gaze and target position, as indicated by an increased P3 amplitude over the centroparietal area and by slower reaction times in the invalid (incongruent) condition than in the valid condition. These results show that attention to facial expression stimuli, a kind of communicative social signal, was affected by fasting, and that fasting improved the mood of practitioners. Moreover, the behavioral and event-related potential data indicated that facial emotion is processed faster than gaze, as participants tended to react faster and to rely on the type of facial emotion rather than on gaze direction while performing the task.
For happy facial expression stimuli, right-hemisphere activation was greater than left-hemisphere activation, consistent with the emotional lateralization account rather than the valence account of emotional processing. PMID:27307772
Neuropsychological Studies of Linguistic and Affective Facial Expressions in Deaf Signers.
ERIC Educational Resources Information Center
Corina, David P.; Bellugi, Ursula; Reilly, Judy
1999-01-01
Presents two studies that explore facial expression production in deaf signers. An experimental paradigm uses chimeric stimuli of American Sign Language linguistic and facial expressions to explore patterns of productive asymmetries in brain-intact signers. (Author/VWL)
Fanti, Kostas A; Panayiotou, Georgia; Lombardo, Michael V; Kyranides, Melina Nicole
2016-01-01
The current study aimed to identify atypical neurophysiological activity associated with deficient affective processing in individuals with high callous-unemotional (CU) traits. Fifty-six participants (M age = 20.52; 46% male), divided into two groups differentiated on levels of CU traits, were invited to participate in the experimental phase of the study. Medial prefrontal cortex activity, measured with functional near-infrared spectroscopy, and facial electromyography activity were recorded during videos depicting violent, comedy and neutral scenes. Individuals high on CU traits showed similar medial prefrontal cortex oxygenated hemoglobin (HbO(2)) activity to positive and negative films, while the prefrontal cortical responses of low CU individuals were more pronounced to positive than negative materials. High CU participants also showed reduced facial electromyography at the corrugator muscle in response to violent films, which was not differentiated from their responses to comedy films. These findings suggest that individuals high on CU traits show reduced but not absent (i.e., flat) affect to emotional material. Deficits in processing positively and negatively valenced material, measured with different neurophysiological modalities, might be essential to understanding CU traits.
ERIC Educational Resources Information Center
Herridge, Matt L.; Harrison, David W.; Mollet, Gina A.; Shenal, Brian V.
2004-01-01
The effects of hostility and a cold pressor stressor on the accuracy of facial affect perception were examined in the present experiment. A mechanism whereby physiological arousal level is mediated by systems which also mediate accuracy of an individual's interpretation of affective cues is described. Right-handed participants were classified as…
NK1 receptor antagonism and emotional processing in healthy volunteers.
Chandra, P; Hafizi, S; Massey-Chase, R M; Goodwin, G M; Cowen, P J; Harmer, C J
2010-04-01
The neurokinin-1 (NK(1)) receptor antagonist, aprepitant, showed activity in several animal models of depression; however, its efficacy in clinical trials was disappointing. There is little knowledge of the role of NK(1) receptors in human emotional behaviour to help explain this discrepancy. The aim of the current study was to assess the effects of a single oral dose of aprepitant (125 mg) on models of emotional processing sensitive to conventional antidepressant drug administration in 38 healthy volunteers, randomly allocated to receive aprepitant or placebo in a between-groups, double-blind design. Performance on measures of facial expression recognition, emotional categorisation, memory and attentional visual-probe was assessed following drug absorption. Relative to placebo, aprepitant improved recognition of happy facial expressions and increased vigilance to emotional information in the unmasked condition of the visual-probe task. In contrast, aprepitant impaired emotional memory and slowed responses in the facial expression recognition task, suggesting possible deleterious effects on cognition. These results suggest that while antagonism of NK(1) receptors does affect emotional processing in humans, its effects are more restricted and less consistent across tasks than those of conventional antidepressants. Human models of emotional processing may provide a useful means of assessing the likely therapeutic potential of new treatments for depression.
Emotion identification and aging: Behavioral and neural age-related changes.
Gonçalves, Ana R; Fernandes, Carina; Pasion, Rita; Ferreira-Santos, Fernando; Barbosa, Fernando; Marques-Teixeira, João
2018-05-01
Aging is known to alter the processing of facial expressions of emotion (FEE); however, the impact of this alteration is less clear. Additionally, there is little information about the temporal dynamics of the neural processing of facial affect. We examined behavioral and neural age-related changes in the identification of FEE using event-related potentials. Furthermore, we analyzed the relationship between behavioral/neural responses and neuropsychological functioning. To this purpose, 30 younger adults, 29 middle-aged adults and 26 older adults identified FEE. The behavioral results showed a similar performance between groups. The neural results showed no significant differences between groups for the P100 component and an increased N170 amplitude in the older group. Furthermore, a pattern of asymmetric activation was evident in the N170 component. Results also suggest deficits in facial feature decoding abilities, reflected by a reduced N250 amplitude in older adults. Neuropsychological functioning predicts P100 modulation, but does not seem to influence emotion identification ability. The findings suggest the existence of a compensatory function that would explain the age-equivalent performance in emotion identification. The study may help future research addressing the behavioral and neural processes involved in the processing of FEE in neurodegenerative conditions. Copyright © 2018 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
Koutsis, Georgios; Kokotis, Panagiotis; Papagianni, Aikaterini E; Evangelopoulos, Maria-Eleftheria; Kilidireas, Constantinos; Karandreas, Nikolaos
2016-09-01
To integrate neurophysiological findings with clinical and imaging data in a consecutive series of multiple sclerosis (MS) patients developing facial numbness during the course of an MS attack. Nine consecutive patients with MS and recent-onset facial numbness were studied clinically, imaged with routine MRI, and assessed neurophysiologically with trigeminal somatosensory evoked potential (TSEP), blink reflex (BR), masseter reflex (MR), facial nerve conduction, facial muscle and masseter EMG studies. All patients had unilateral facial hypoesthesia on examination and lesions in the ipsilateral pontine tegmentum on MRI. All patients had abnormal TSEPs upon stimulation of the affected side, except for one patient tested after remission of the numbness. BR was the second most sensitive neurophysiological method, with 6/9 examinations exhibiting an abnormal R1 component. The MR was abnormal in 3/6 patients, always on the affected side. Facial nerve conduction and EMG studies were normal in all patients but one. Facial numbness was always related to abnormal TSEPs. A concomitant R1 abnormality on BR allowed localization of the responsible pontine lesion, which closely corresponded with MRI findings. We conclude that neurophysiological assessment of MS patients with facial numbness is a sensitive tool, which complements MRI and can improve lesion localization. Copyright © 2016 Elsevier B.V. All rights reserved.
Di Cerbo, Alessandro; Laurino, Carmen; Palmieri, Beniamino; Iannitti, Tommaso
2015-03-01
Excessive exposure to the sun can cause severe photoaging as early as the second decade of life resulting in a loss of physiological elastic fiber functions. We designed a first study to assess differences in facial skin pH, sebum, elasticity, hydration and tonicity and serum levels of fibronectin, elastin, neutrophil elastase 2, hyaluronic acid and carbonylated proteins between patients affected by facial photoaging and healthy controls. In a second study we tested the hypothesis that a dietary supplement would improve facial photoaging, also promoting changes in the above mentioned skin and serum parameters. In the first study we enrolled 30 women [age: 47.5 ± 1.6 years (mean ± standard error of the mean)] affected by moderate facial photoaging (4 cm ≤ Visual Analogue Scale (VAS)<7 cm) and 30 healthy women [age: 45.9 ± 1.6 years (mean ± standard error of the mean)]. In the second study we enrolled a cohort of 30 women [age: 43.6 ± 1.2 years (mean ± standard error of the mean)], affected by moderate (n = 22) and severe (VAS ≥ 7 cm; n = 8) facial photoaging, who were randomized to receive a pharmaceutical formulation (VISCODERM Pearls; IBSA FARMACEUTICI ITALIA Srl, Lodi, Italy) containing Pycnogenol, collagen, coenzyme Q10, low-molecular-weight hyaluronic acid, chondroitin sulfate and glucosamine sulfate (n = 15) or placebo (n = 15). Dietary supplement and placebo were administered 2 times a day for 4 weeks. Facial photoaging was assessed by VAS in the first cohort of patients affected by facial photoaging and healthy controls and, at baseline and 2 weeks after the end of treatment, in the second cohort of patients who underwent treatment with VISCODERM Pearls and placebo. Skin Tester was used to analyze differences in facial skin parameters between patients affected by facial photoaging and healthy controls. Skin Tester was also used to assess the effect of VISCODERM Pearls on facial skin parameters and compared with placebo 2 weeks after the end of treatment. 
Serum levels of fibronectin, elastin, neutrophil elastase 2, hyaluronic acid and carbonylated proteins were measured by enzyme-linked immunosorbent assay in the first cohort of patients affected by facial photoaging and healthy controls and, at baseline and 2 weeks after the end of treatment, in the second cohort of patients who underwent treatment with VISCODERM Pearls and placebo. VAS photoaging score was higher in patients affected by photoaging, if compared with healthy controls (p < 0.0001). pH and sebum were increased in patients affected by photoaging, if compared with healthy controls (both p < 0.0001), while elasticity, hydration and tonicity were decreased in patients affected by photoaging, if compared with healthy controls (all p < 0.0001). Serum fibronectin and hyaluronic acid concentrations were lower in patients affected by photoaging, if compared with healthy controls (both p < 0.0001). Serum neutrophil elastase 2, elastin and carbonylated protein concentrations were higher in patients affected by photoaging, if compared with healthy controls (p < 0.01, p < 0.01 and p < 0.0001, respectively). Dietary supplement administration resulted in an improvement in VAS photoaging score, if compared with placebo (p < 0.0001), as observed 2 weeks after the end of treatment. Facial sebum, hydration and tonicity were increased in the active treatment group vs. placebo (p < 0.0001, p < 0.0001 and p < 0.05, respectively) 2 weeks after the end of treatment. Serum fibronectin and hyaluronic acid concentrations were increased in the dietary supplement group, if compared with placebo (p < 0.01 and p < 0.001) 2 weeks after the end of treatment, while no statistical difference in serum elastin concentration was observed between the two groups. Serum neutrophil elastase 2 and carbonylated protein concentrations were decreased in the dietary supplement group 2 weeks after the end of treatment, if compared with placebo (p < 0.001 and p < 0.0001). 
We found significantly increased serum levels of neutrophil elastase 2, elastin and carbonylated proteins and decreased levels of hyaluronic acid and fibronectin in patients affected by facial photoaging, if compared with healthy controls. These findings were coupled with a significant decrease in skin hydration, tonicity and elasticity and increased skin pH and sebum. Treatment with the dietary supplement VISCODERM Pearls significantly improved VAS photoaging score and skin hydration, sebum and tonicity 2 weeks after the end of a 4-week treatment period in patients affected by moderate to severe facial photoaging. These findings were coupled with a significant increase in serum fibronectin and hyaluronic acid and a decrease in serum carbonylated proteins and neutrophil elastase 2 in the active treatment group, if compared with placebo. Our findings suggest that VISCODERM Pearls is effective for the treatment of facial photoaging, but further studies in larger cohorts of patients are required. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
Proposed shade guide for human facial skin and lip: a pilot study.
Wee, Alvin G; Beatty, Mark W; Gozalo-Diaz, David J; Kim-Pusateri, Seungyee; Marx, David B
2013-08-01
Currently, no commercially available facial shade guide exists in the United States for the fabrication of facial prostheses. The purpose of this study was to measure facial skin and lip color in a human population sample stratified by age, gender, and race. Clustering analysis was used to determine optimal color coordinates for a proposed facial shade guide. Participants (n=119) were recruited from 4 racial/ethnic groups, 5 age groups, and both genders. Reflectance measurements of participants' noses and lower lips were made by using a spectroradiometer and xenon arc lamp with a 45/0 optical configuration. Repeated measures ANOVA (α=.05) was used to identify skin and lip color differences resulting from race, age, gender, and location, and hierarchical clustering analysis was used to identify clusters of skin colors. Significant contributors to L*a*b* facial color were race and facial location (P<.01). All factors affected b* (P<.05): age affected only b* (P<.001), while gender affected only L* (P<.05) and b* (P<.05). Analyses identified 5 clusters of skin color. The study showed that skin color differences caused by age and gender primarily occurred along the yellow-blue axis. A significant lightness difference between gender groups was also found. Clustering analysis identified 5 distinct skin shade tabs. Copyright © 2013 The Editorial Council of the Journal of Prosthetic Dentistry. Published by Mosby, Inc. All rights reserved.
Pace, Michela; Cioffi, Iacopo; D'antò, Vincenzo; Valletta, Alessandra; Valletta, Rosa; Amato, Massimo
2018-06-01
Physical attractiveness is dependent on facial appearance. The facial profile plays a crucial role in facial attractiveness and can be improved with orthodontic treatment. The aesthetic assessment of facial appearance may be influenced by the cultural background and education of the assessor and dependent upon the experience level of dental professionals. This study aimed to evaluate how the sagittal jaw relationship in Class I and Class II individuals affects facial attractiveness, and whether the assessor's professional education and background affect the perception of facial attractiveness. Facial silhouettes simulating mandibular retrusion, maxillary protrusion, mandibular retrusion combined with maxillary protrusion, bimaxillary protrusion and severe bimaxillary protrusion in Class I and Class II patients were assessed by five groups of people with different backgrounds and education levels (i.e., 23 expert orthodontists, 21 orthodontists, 15 maxillofacial surgeons, 19 orthodontic patients and 28 laypeople). Straight facial profiles were judged to be more attractive than convex profiles due to severe mandibular retrusion and to mandibular retrusion combined with maxillary protrusion (all P<0.05). Convex profiles due to a slightly retruded position of the mandible were judged less attractive by clinicians than by patients and laypeople (all P<0.05). Convex facial profiles are less attractive than Class I profiles. The assessment of facial attractiveness is dependent on the assessor's education and background. Laypeople and patients are considerably less sensitive to abnormal sagittal jaw relationships than orthodontists.
Morphological Integration of Soft-Tissue Facial Morphology in Down Syndrome and Siblings
Starbuck, John; Reeves, Roger H.; Richtsmeier, Joan
2011-01-01
Down syndrome (DS), resulting from trisomy of chromosome 21, is the most common live-born human aneuploidy. The phenotypic expression of trisomy 21 produces variable, though characteristic, facial morphology. Although certain facial features have been documented quantitatively and qualitatively as characteristic of DS (e.g., epicanthic folds, macroglossia, and hypertelorism), all of these traits occur in other craniofacial conditions with an underlying genetic cause. We hypothesize that the typical DS face is integrated differently than the face of non-DS siblings, and that the pattern of morphological integration unique to individuals with DS will yield information about underlying developmental associations between facial regions. We statistically compared morphological integration patterns of immature DS faces (N = 53) with those of non-DS siblings (N = 54), aged 6–12 years, using 31 distances estimated from 3D coordinate data representing 17 anthropometric landmarks recorded on 3D digital photographic images. Facial features are affected differentially in DS, as evidenced by statistically significant differences in integration both within and between facial regions. Our results suggest a differential effect of trisomy on facial prominences during craniofacial development. PMID:21996933
Affective indicators of the psychotherapeutic process: an empirical case study.
Dreher, M; Mengele, U; Krause, R; Kämmerer, A
2001-03-01
By analyzing facial expressions of emotion and the emotional experience of a patient and a psychotherapist, we attempted to objectively register unconscious interaction processes that could have contributed to the failure of a psychotherapy that ended prematurely. In this connection, the affect 'contempt' played a particular role. It is made clear how an unconscious enactment results in a gap between emotional expression and experience. In addition, the countertransference of the psychotherapist is examined and her emotional experience is contrasted with her affective behavior. In this study, it is demonstrated how this particular psychotherapy failed because the involvement of the interactive dynamics was not acknowledged.
Altering sensorimotor feedback disrupts visual discrimination of facial expressions.
Wood, Adrienne; Lupyan, Gary; Sherrin, Steven; Niedenthal, Paula
2016-08-01
Looking at another person's facial expression of emotion can trigger the same neural processes involved in producing the expression, and such responses play a functional role in emotion recognition. Disrupting individuals' facial action, for example, interferes with verbal emotion recognition tasks. We tested the hypothesis that facial responses also play a functional role in the perceptual processing of emotional expressions. We altered the facial action of participants with a gel facemask while they performed a task that involved distinguishing target expressions from highly similar distractors. Relative to control participants, participants in the facemask condition demonstrated inferior perceptual discrimination of facial expressions, but not of nonface stimuli. The findings suggest that somatosensory/motor processes involving the face contribute to the visual perceptual (and not just conceptual) processing of facial expressions. More broadly, our study contributes to growing evidence for the fundamentally interactive nature of the perceptual inputs from different sensory modalities.
Late revision or correction of facial trauma-related soft-tissue deformities.
Rieck, Kevin L; Fillmore, W Jonathan; Ettinger, Kyle S
2013-11-01
Surgical approaches used in accessing the facial skeleton for fracture repair are often the same as or similar to those used for cosmetic enhancement of the face. Rarely does facial trauma result in injuries that do not in some way affect the facial soft-tissue envelope either directly or as sequelae of the surgical repair. Knowledge of both skeletal and facial soft-tissue anatomy is paramount to successful clinical outcomes. Facial soft-tissue deformities can arise that require specific evaluation and management for correction. This article focuses on revision and correction of these soft-tissue-related injuries secondary to facial trauma. Copyright © 2013. Published by Elsevier Inc.
Cephalometric features in isolated growth hormone deficiency.
Oliveira-Neto, Luiz Alves; Melo, Maria de Fátima B; Franco, Alexandre A; Oliveira, Alaíde H A; Souza, Anita H O; Valença, Eugênia H O; Britto, Isabela M P A; Salvatori, Roberto; Aguiar-Oliveira, Manuel H
2011-07-01
To analyze cephalometric features in adults with isolated growth hormone (GH) deficiency (IGHD). Nine adult IGHD individuals (7 males and 2 females; mean age, 37.8 ± 13.8 years) underwent a cross-sectional cephalometric study, including 9 linear and 5 angular measurements. Posterior facial height/anterior facial height and lower-anterior facial height/anterior facial height ratios were calculated. To pool cephalometric measurements in both genders, results were normalized by standard deviation scores (SDS), using the population means from an atlas of the normal Brazilian population. All linear measurements were reduced in IGHD subjects. Total maxillary length was the most reduced parameter (-6.5 ± 1.7), followed by a cluster of six measurements: posterior cranial base length (-4.9 ± 1.1), total mandibular length (-4.4 ± 0.7), total posterior facial height (-4.4 ± 1.1), total anterior facial height (-4.3 ± 0.9), mandibular corpus length (-4.2 ± 0.8), and anterior cranial base length (-4.1 ± 1.7). Less affected measurements were lower-anterior facial height (-2.7 ± 0.7) and mandibular ramus height (-2.5 ± 1.5). SDS angular measurements were in the normal range, except for increased gonial angle (+2.5 ± 1.1). Posterior facial height/anterior facial height and lower-anterior facial height/anterior facial height ratios were not different from those of the reference group. Congenital, untreated IGHD causes reduction of all linear measurements of craniofacial growth, particularly total maxillary length. Angular measurements and facial height ratios are less affected, suggesting that IGHD causes proportional blunting of craniofacial growth.
Tryptophan depletion decreases the recognition of fear in female volunteers.
Harmer, C J; Rogers, R D; Tunbridge, E; Cowen, P J; Goodwin, G M
2003-06-01
Serotonergic processes have been implicated in the modulation of fear conditioning in humans, postulated to occur at the level of the amygdala. The processing of other fear-relevant cues, such as facial expressions, has also been associated with amygdala function, but an effect of serotonin depletion on these processes has not been assessed. The present study investigated the effects of reducing serotonin function, using acute tryptophan depletion, on the recognition of basic facial expressions of emotion in healthy male and female volunteers. A double-blind, between-groups design was used, with volunteers randomly allocated to receive an amino acid drink specifically lacking tryptophan or a control mixture containing a balanced mixture of these amino acids. Participants were given a facial expression recognition task 5 h after drink administration. This task featured examples of six basic emotions (fear, anger, disgust, surprise, sadness and happiness) that had been morphed between each full emotion and neutral in 10% steps. As a control, volunteers were given a famous face classification task matched in terms of response selection and difficulty level. Tryptophan depletion significantly impaired the recognition of fearful facial expressions in female, but not male, volunteers. This effect was specific, since recognition of other basic emotions was comparable in the two groups. There was also no effect of tryptophan depletion on the classification of famous faces or on subjective state ratings of mood or anxiety. These results confirm a role for serotonin in the processing of fear-related cues and, in line with previous findings, suggest greater effects of tryptophan depletion in female volunteers. Although acute tryptophan depletion does not typically affect mood in healthy subjects, the present results suggest that subtle changes in the processing of emotional material may occur with this manipulation of serotonin function.
Lepage, M; Sergerie, K; Benoit, A; Czechowska, Y; Dickie, E; Armony, J L
2011-09-01
There is a general consensus in the literature that schizophrenia causes difficulties with facial emotion perception and discrimination. Functional brain imaging studies have observed reduced limbic activity during facial emotion perception but few studies have examined the relation to flat affect severity. A total of 26 people with schizophrenia and 26 healthy controls took part in this event-related functional magnetic resonance imaging study. Sad, happy and neutral faces were presented in a pseudo-random order and participants indicated the gender of the face presented. Manual segmentation of the amygdala was performed on a structural T1 image. Both the schizophrenia group and the healthy control group rated the emotional valence of facial expressions similarly. Both groups exhibited increased brain activity during the perception of emotional faces relative to neutral ones in multiple brain regions, including multiple prefrontal regions bilaterally, the right amygdala, right cingulate cortex and cuneus. Group comparisons, however, revealed increased activity in the healthy group in the anterior cingulate, right parahippocampal gyrus and multiple visual areas. In schizophrenia, the severity of flat affect correlated significantly with neural activity in several brain areas including the amygdala and parahippocampal region bilaterally. These results suggest that many of the brain regions involved in emotional face perception, including the amygdala, are equally recruited in both schizophrenia and controls, but flat affect can also moderate activity in some other brain regions, notably in the left amygdala and parahippocampal gyrus bilaterally. There were no significant group differences in the volume of the amygdala.
[Magnetic resonance imaging in facial injuries and digital fusion CT/MRI].
Kozakiewicz, Marcin; Olszycki, Marek; Arkuszewski, Piotr; Stefańczyk, Ludomir
2006-01-01
Magnetic resonance images (MRI) and their digital fusion with computed tomography (CT) data in patients with facial injuries are presented in this study. MR imaging of 12 posttraumatic patients was performed in the same planes as their previous CT scans. Evaluation focused on the quality of facial soft-tissue depiction, which was unsatisfactory on CT. Digital fusion of the two modalities was performed using the authors' own "Dental Studio" program. Pathologic dislocations and injuries of the facial soft tissues are visualized better on MRI than on CT examination. In particular, MRI properly reveals disturbances in intraorbital soft-tissue structures. MRI-based assessment is valuable in patients with facial soft-tissue injuries, especially in cases of orbital/sinus herniation. Fused CT/MRI scans allow simultaneous evaluation of the bone structure and soft tissues of the same region.
Corneanu, Ciprian Adrian; Simon, Marc Oliu; Cohn, Jeffrey F; Guerrero, Sergio Escalera
2016-08-01
Facial expressions are an important way through which humans interact socially. Building a system capable of automatically recognizing facial expressions from images and video has been an intense field of study in recent years. Interpreting such expressions remains challenging, and much research is needed about the way they relate to human affect. This paper presents a general overview of automatic RGB, 3D, thermal and multimodal facial expression analysis. We define a new taxonomy for the field, encompassing all steps from face detection to facial expression recognition, and describe and classify the state-of-the-art methods accordingly. We also present the important datasets and the benchmarking of the most influential methods. We conclude with a general discussion about trends, important questions and future lines of research.
Recognition of schematic facial displays of emotion in parents of children with autism.
Palermo, Mark T; Pasqualetti, Patrizio; Barbati, Giulia; Intelligente, Fabio; Rossini, Paolo Maria
2006-07-01
Performance on an emotional labeling task in response to schematic facial patterns representing five basic emotions without the concurrent presentation of a verbal category was investigated in 40 parents of children with autism and 40 matched controls. 'Autism fathers' performed worse than 'autism mothers', who performed worse than controls in decoding displays representing sadness or disgust. This indicates the need to include facial expression decoding tasks in genetic research of autism. In addition, emotional expression interactions between parents and their children with autism, particularly through play, where affect and prosody are 'physiologically' exaggerated, may stimulate development of social competence. Future studies could benefit from a combination of stimuli including photographs and schematic drawings, with and without associated verbal categories. This may allow the subdivision of patients and relatives on the basis of the amount of information needed to understand and process social-emotionally relevant information.
Mancuso, Mauro; Magnani, Nadia; Cantagallo, Anna; Rossi, Giulia; Capitani, Donatella; Galletti, Vania; Cardamone, Giuseppe; Robertson, Ian Hamilton
2015-02-01
The aim of our study was to identify the common and separate mechanisms that might underpin emotion recognition impairment in patients with traumatic brain injury (TBI) and schizophrenia (Sz) compared with healthy controls (HCs). We recruited 21 Sz outpatients, 24 severe TBI outpatients, and 38 HCs, and we used eye-tracking to compare facial emotion processing performance. Both Sz and TBI patients were significantly poorer at recognizing facial emotions compared with HCs. Sz patients showed a different way of exploring the Pictures of Facial Affect stimuli and were significantly worse in recognition of neutral expressions. Selective or sustained attention deficits in TBI may reduce efficient emotion recognition, whereas in Sz, there is a more strategic deficit underlying the observed problem. There would seem to be scope for effective rehabilitative training focused on emotion recognition.
The influence of attention toward facial expressions on size perception.
Choi, Jeong-Won; Kim, Kiho; Lee, Jang-Han
2016-01-01
According to the New Look theory, size perception is affected by emotional factors. Although previous studies have attempted to explain the effects of both emotion and motivation on size perception, they have failed to identify the underlying mechanisms. This study aimed to investigate the underlying mechanisms of size perception by applying attention toward facial expressions using the Ebbinghaus illusion as a measurement tool. The participants, female university students, were asked to judge the size of a target stimulus relative to the size of facial expressions (i.e., happy, angry, and neutral) surrounding the target. The results revealed that the participants perceived angry and neutral faces to be larger than happy faces. This finding indicates that individuals pay closer attention to neutral and angry faces than happy ones. These results suggest that the mechanisms underlying size perception involve cognitive processes that focus attention toward relevant stimuli and block out irrelevant stimuli.
Quantifying facial expression recognition across viewing conditions.
Goren, Deborah; Wilson, Hugh R
2006-04-01
Facial expressions are key to social interactions and to assessment of potential danger in various situations. Therefore, our brains must be able to recognize facial expressions when they are transformed in biologically plausible ways. We used synthetic happy, sad, angry and fearful faces to determine the amount of geometric change required to recognize these emotions during brief presentations. Five-alternative forced choice conditions involving central viewing, peripheral viewing and inversion were used to study recognition among the four emotions. Two-alternative forced choice was used to study affect discrimination when spatial frequency information in the stimulus was modified. The results show an emotion and task-dependent pattern of detection. Facial expressions presented with low peak frequencies are much harder to discriminate from neutral than faces defined by either mid or high peak frequencies. Peripheral presentation of faces also makes recognition much more difficult, except for happy faces. Differences between fearful detection and recognition tasks are probably due to common confusions with sadness when recognizing fear from among other emotions. These findings further support the idea that these emotions are processed separately from each other.
Prosopo-affective agnosia associated with chronic organic brain syndrome.
Kurucz, J; Feldmar, G; Werner, W
1979-02-01
Impairment of the ability to recognize facially expressed emotions was studied in 14 chronically disoriented patients with chronic organic brain syndrome (CBS). This impairment was named prosopo-affective agnosia (PAA). A diagnostic requirement was relatively intact neurologic functioning in underlying perceptual-verbal-motor processing. A test was designed for facial-affect recognition in the accurate differentiation of normal persons from chronically disoriented CBS patients. No normal subject made any errors in this test. Despite decades of illness and hospital living, patients with a history of schizophrenia or major affective disorders scored almost at a normal level (95 vs. 100 percent) in this test, and significantly higher (95 vs. 66 percent) than did the disoriented CBS patients. The social and therapeutic implications of the findings are stressed. CBS patients may be impaired with respect to receiving and appreciating elementary aspects of social communications such as recognizing a smile, anger, sadness or disapproval on the faces of people who surround them. This disability requires understanding and a special attitude on the part of the therapeutic team toward such patients.
Kempton, Matthew J; Haldane, Morgan; Jogia, Jigar; Christodoulou, Tessa; Powell, John; Collier, David; Williams, Steven C R; Frangou, Sophia
2009-04-01
The functional catechol-O-methyltransferase (COMT Val108/158Met) polymorphism has been shown to have an impact on tasks of executive function, memory and attention and recently, tasks with an affective component. As oestrogen reduces COMT activity, we focused on the interaction between gender and COMT genotype on brain activations during an affective processing task. We used functional MRI (fMRI) to record brain activations from 74 healthy subjects who engaged in a facial affect recognition task; subjects viewed and identified fearful compared to neutral faces. There was no main effect of the COMT polymorphism, gender or genotype × gender interaction on task performance. We found a significant effect of gender on brain activations in the left amygdala and right temporal pole, where females demonstrated increased activations over males. Within these regions, Val/Val carriers showed greater signal magnitude compared to Met/Met carriers, particularly in females. The COMT Val108/158Met polymorphism impacts on gender-related patterns of activation in limbic and paralimbic regions but the functional significance of any oestrogen-related COMT inhibition appears modest.
On the facilitative effects of face motion on face recognition and its development
Xiao, Naiqi G.; Perrotta, Steve; Quinn, Paul C.; Wang, Zhe; Sun, Yu-Hao P.; Lee, Kang
2014-01-01
For the past century, researchers have extensively studied human face processing and its development. These studies have advanced our understanding of not only face processing, but also visual processing in general. However, most of what we know about face processing was investigated using static face images as stimuli. Therefore, an important question arises: to what extent does our understanding of static face processing generalize to face processing in real-life contexts in which faces are mostly moving? The present article addresses this question by examining recent studies on moving face processing to uncover the influence of facial movements on face processing and its development. First, we describe evidence on the facilitative effects of facial movements on face recognition and two related theoretical hypotheses: the supplementary information hypothesis and the representation enhancement hypothesis. We then highlight several recent studies suggesting that facial movements optimize face processing by activating specific face processing strategies that accommodate to task requirements. Lastly, we review the influence of facial movements on the development of face processing in the first year of life. We focus on infants' sensitivity to facial movements and explore the facilitative effects of facial movements on infants' face recognition performance. We conclude by outlining several future directions to investigate moving face processing and emphasize the importance of including dynamic aspects of facial information to further understand face processing in real-life contexts. PMID:25009517
Facial blanching after inferior alveolar nerve block anesthesia: an unusual complication.
Kang, Sang-Hoon; Won, Yu-Jin
2017-12-01
The present case report describes a complication involving facial blanching symptoms occurring during inferior alveolar nerve block anesthesia (IANBA). Facial blanching after IANBA can be caused by the injection of an anesthetic into the maxillary artery area, affecting the infraorbital artery.
Recent Advances in Face Lift to Achieve Facial Balance.
Ilankovan, Velupillai
2017-03-01
Facial balance is achieved by correction of facial proportions and the facial contour. Ageing affects this balance in addition to other factors. We have strived to summarize all the recent advances in restoring this balance. The anatomy of ageing, including various changes in clinical features, is described. The procedures are explained on the basis of the upper, middle and lower face. Different face-lift and neck-lift procedures with innovative techniques are demonstrated. The aim is to provide an unoperated, balanced facial proportion with zero complications.
Facial Expression of Affect in Children with Cornelia de Lange Syndrome
ERIC Educational Resources Information Center
Collis, L.; Moss, J.; Jutley, J.; Cornish, K.; Oliver, C.
2008-01-01
Background: Individuals with Cornelia de Lange syndrome (CdLS) have been reported to show comparatively high levels of flat and negative affect but there have been no empirical evaluations. In this study, we use an objective measure of facial expression to compare affect in CdLS with that seen in Cri du Chat syndrome (CDC) and a group of…
Kujala, Miiamaaria V; Somppi, Sanni; Jokela, Markus; Vainio, Outi; Parkkonen, Lauri
2017-01-01
Facial expressions are important for humans in communicating emotions to the conspecifics and enhancing interpersonal understanding. Many muscles producing facial expressions in humans are also found in domestic dogs, but little is known about how humans perceive dog facial expressions, and which psychological factors influence people's perceptions. Here, we asked 34 observers to rate the valence, arousal, and the six basic emotions (happiness, sadness, surprise, disgust, fear, and anger/aggressiveness) from images of human and dog faces with Pleasant, Neutral and Threatening expressions. We investigated how the subjects' personality (the Big Five Inventory), empathy (Interpersonal Reactivity Index) and experience of dog behavior affect the ratings of dog and human faces. Ratings of both species followed similar general patterns: human subjects classified dog facial expressions from pleasant to threatening very similarly to human facial expressions. Subjects with higher emotional empathy evaluated Threatening faces of both species as more negative in valence and higher in anger/aggressiveness. More empathetic subjects also rated the happiness of Pleasant humans but not dogs higher, and they were quicker in their valence judgments of Pleasant human, Threatening human and Threatening dog faces. Experience with dogs correlated positively with ratings of Pleasant and Neutral dog faces. Personality also had a minor effect on the ratings of Pleasant and Neutral faces in both species. The results imply that humans perceive human and dog facial expression in a similar manner, and the perception of both species is influenced by psychological factors of the evaluators. Especially empathy affects both the speed and intensity of rating dogs' emotional facial expressions.
Platt, Bradley; Kamboj, Sunjeev; Morgan, Celia J A; Curran, H Valerie
2010-11-01
While heavy cannabis-users seem to show various cognitive impairments, it remains unclear whether they also experience significant deficits in affective functioning. Evidence of such deficits may contribute to our understanding of the interpersonal difficulties in cannabis-users, and the link between cannabis-use and psychological disorders (Moore et al., 2007). Emotion recognition performance of heavy cannabis-users and non-using controls was compared. A measure of emotion recognition was used in which participants identified facial expressions as they changed from neutral (open-mouth) to gradually more intense expressions of sadness, neutral, anger or happiness (open or closed mouth). Reaction times and accuracy were recorded as the facial expressions changed. Participants also completed measures of 'theory of mind,' depression and impulsivity. Cannabis-users were significantly slower than controls at identifying all three emotional expressions. There was no difference between groups in identifying facial expressions changing from open-mouth neutral expressions to closed-mouth neutral expressions suggesting that differences in emotion recognition were not due to a general slowing of reaction times. Cannabis-users were also significantly more liberal in their response criterion for recognising sadness. Heavy cannabis-use may be associated with affect recognition deficits. In particular, a greater intensity of emotion expression was required before identification of positive and negative emotions. This was found using stimuli which simulated dynamic changes in emotion expression, and in turn, suggests that cannabis-users may experience generalised problems in decoding basic emotions during social interactions. The implications of these findings are discussed for vulnerability to psychological and interpersonal difficulties in cannabis-users. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Facial color processing in the face-selective regions: an fMRI study.
Nakajima, Kae; Minami, Tetsuto; Tanabe, Hiroki C; Sadato, Norihiro; Nakauchi, Shigeki
2014-09-01
Facial color is important information for social communication, as it provides important clues for recognizing a person's emotion and health condition. Our previous EEG study suggested that N170 at the left occipito-temporal site is related to facial color processing (Nakajima et al., [2012]: Neuropsychologia 50:2499-2505). However, because of the low spatial resolution of EEG, which brain regions are involved in facial color processing remains controversial. In the present study, we examined the neural substrates of facial color processing using functional magnetic resonance imaging (fMRI). We measured brain activity from 25 subjects during the presentation of natural- and bluish-colored faces and their scrambled images. The bilateral fusiform face area (FFA) and occipital face area (OFA) were localized by the contrast of natural-colored faces versus natural-colored scrambled images. Moreover, region-of-interest (ROI) analysis showed that the left FFA was sensitive to facial color, whereas the right FFA and the left and right OFA were insensitive to it. In combination with our previous EEG results, these data suggest that the left FFA may play an important role in facial color processing. Copyright © 2014 Wiley Periodicals, Inc.
Tron, Talia; Peled, Abraham; Grinsphoon, Alexander; Weinshall, Daphna
2016-08-01
Incongruity between emotional experience and its outward expression is one of the prominent symptoms in schizophrenia. Though widely reported and used in clinical evaluation, this symptom is inadequately defined in the literature and may be confused with mere affect flattening. In this study we used a structured-light depth camera and dedicated software to automatically measure the facial activity of schizophrenia patients and healthy individuals during an emotionally evocative task. We defined novel measures for the congruence of emotional experience and emotional expression and for flat affect, compared them between patients and controls, and examined their consistency with clinical evaluation. We found incongruity in schizophrenia to be manifested in a less specific range of facial expressions in response to similar emotional stimuli, while the emotional experience remains intact. Our study also suggests that when affect flattening is taken into consideration, no contextually inappropriate facial expressions are evident.
Fisher, Katie; Towler, John; Eimer, Martin
2016-01-08
It is frequently assumed that facial identity and facial expression are analysed in functionally and anatomically distinct streams within the core visual face processing system. To investigate whether expression and identity interact during the visual processing of faces, we employed a sequential matching procedure where participants compared either the identity or the expression of two successively presented faces, and ignored the other irrelevant dimension. Repetitions versus changes of facial identity and expression were varied independently across trials, and event-related potentials (ERPs) were recorded during task performance. Irrelevant facial identity and irrelevant expression both interfered with performance in the expression and identity matching tasks. These symmetrical interference effects show that neither identity nor expression can be selectively ignored during face matching, and suggest that they are not processed independently. N250r components to identity repetitions that reflect identity matching mechanisms in face-selective visual cortex were delayed and attenuated when there was an expression change, demonstrating that facial expression interferes with visual identity matching. These findings provide new evidence for interactions between facial identity and expression within the core visual processing system, and question the hypothesis that these two attributes are processed independently. Copyright © 2015 Elsevier Ltd. All rights reserved.
Unattractive infant faces elicit negative affect from adults
Schein, Stevie S.; Langlois, Judith H.
2015-01-01
We examined the relationship between infant attractiveness and adult affect by investigating whether differing levels of infant facial attractiveness elicit facial muscle movement correlated with positive and negative affect from adults (N = 87) using electromyography. Unattractive infant faces evoked significantly more corrugator supercilii and levator labii superioris movement (physiological correlates of negative affect) than attractive infant faces. These results suggest that unattractive infants may be at risk for negative affective responses from adults, though the relationship between those responses and caregiving behavior remains elusive. PMID:25658199
The affect of tissue depth variation on craniofacial reconstructions.
Starbuck, John M; Ward, Richard E
2007-10-25
We examined the effect of tissue depth variation on the reconstruction of facial form, through the application of the American method, utilizing published tissue depth measurements for emaciated, normal, and obese faces. In this preliminary study, three reconstructions were created on reproductions of the same skull for each set of tissue depth measurements. The resulting morphological variation was measured quantitatively using the anthropometric craniofacial variability index (CVI). This method employs 16 standard craniofacial anthropometric measurements, and the results reflect "pattern variation," or facial harmony. We report no appreciable variation in the quantitative measure of the pattern of facial form obtained from the three different sets of tissue depths. Facial similarity was assessed qualitatively utilizing surveys of photographs of the three reconstructions. Surveys indicated that subjects frequently perceived the reconstructions as representing different individuals. This disagreement indicates that the size of the face may blind observers to similarities in facial form. This research is significant because it illustrates the confounding effect that normal human variation contributes to the successful recognition of individuals from a representational three-dimensional facial reconstruction. Research results suggest that successful identification could be increased if multiple reconstructions were created to reflect a wide range of possible outcomes for facial form. The creation of multiple facial images from a single skull will be facilitated as computerized versions of facial reconstruction are further developed and refined.
Pattern of facial palsy in a typical Nigerian specialist hospital.
Lamina, S; Hanif, S
2012-12-01
Data on the incidence of facial palsy are generally lacking in Nigeria. To assess the six-year incidence of facial palsy in Murtala Muhammed Specialist Hospital (MMSH), Kano, Nigeria, the records of patients diagnosed with facial problems between January 2000 and December 2005 were scrutinized. Data on diagnosis, age, sex, side affected, occupation and causes were obtained. A total of 698 patients with facial problems were recorded, of whom 594 (85%) were diagnosed with facial palsy. Among the diagnosed cases, males (56.2%) had a higher incidence than females; the 20-34 years age group (40.3%) had the greatest prevalence; the commonest cause of facial palsy was found to be idiopathic (39.1%), and it was most common among businessmen (31.6%). Right-sided facial palsy (52.2%) was predominant. The incidence of facial palsy was highest in 2003 (25.3%) and decreased from 2004. It was concluded that the incidence of facial palsy was high and that Bell's palsy remains the most common cause of facial (nerve) paralysis.
ERIC Educational Resources Information Center
Ray, Arindam; Chakrabarti, Amlan
2016-01-01
Technology Enabled Learning is a cognitive, constructive, systematic, collaborative learning procedure that transforms teaching-learning pedagogy, in which the role of emotion is very often neglected. Emotion plays a significant role in human cognitive processes, so the transformation is incomplete without capturing the learner's emotional…
Down Syndrome and Automatic Processing of Familiar and Unfamiliar Emotional Faces
ERIC Educational Resources Information Center
Morales, Guadalupe E.; Lopez, Ernesto O.
2010-01-01
Participants with Down syndrome (DS) were required to participate in a face recognition experiment to recognize familiar (DS faces) and unfamiliar emotional faces (non DS faces), by using an affective priming paradigm. Pairs of emotional facial stimuli were presented (one face after another) with a short Stimulus Onset Asynchrony of 300…
Bansal, Ravi; Liu, Jun; Gerber, Andrew J.; Goh, Suzanne; Posner, Jonathan; Colibazzi, Tiziano; Algermissen, Molly; Chiang, I-Chin; Russell, James A.; Peterson, Bradley S.
2015-01-01
The Affective Circumplex Model holds that emotions can be described as linear combinations of two underlying, independent neurophysiological systems (arousal, valence). Given research suggesting individuals with autism spectrum disorders (ASD) have difficulty processing emotions, we used the circumplex model to compare how individuals with ASD and typically-developing (TD) individuals respond to facial emotions. Participants (51 ASD, 80 TD) rated facial expressions along arousal and valence dimensions; we fitted closed, smooth, 2-dimensional curves to their ratings to examine overall circumplex contours. We modeled individual and group influences on parameters describing curve contours to identify differences in dimensional effects across groups. Significant main effects of diagnosis indicated the ASD group's ratings were constricted for the entire circumplex, suggesting range constriction across all emotions. Findings did not change when covarying for overall intelligence. PMID:24234677
Fetal alcohol spectrum disorders: an overview.
Riley, Edward P; Infante, M Alejandra; Warren, Kenneth R
2011-06-01
When fetal alcohol syndrome (FAS) was initially described, diagnosis was based upon physical parameters including facial anomalies and growth retardation, with evidence of developmental delay or mental deficiency. Forty years of research has shown that FAS lies towards the extreme end of what are now termed fetal alcohol spectrum disorders (FASD). The most profound effects of prenatal alcohol exposure are on the developing brain and the cognitive and behavioral effects that ensue. Alcohol exposure affects brain development via numerous pathways at all stages from neurogenesis to myelination. For example, the same processes that give rise to the facial characteristics of FAS also cause abnormal brain development. Behaviors as diverse as executive functioning to motor control are affected. This special issue of Neuropsychology Review addresses these changes in brain and behavior highlighting the relationship between the two. A diagnostic goal is to recognize FAS as a disorder of brain rather than one of physical characteristics.
Köchel, Angelika; Leutgeb, Verena; Schienle, Anne
2014-04-01
This event-related potential study focused on neural correlates of inhibitory affective control in attention-deficit hyperactivity disorder (ADHD). Sixteen boys with ADHD and 16 healthy boys underwent an emotional Go/NoGo task with pictures of facial expressions from the categories anger, sadness, happiness, and neutral. The participants were instructed to execute or withhold a motor response to specific emotions. Patients relative to controls displayed a severe impairment in response inhibition toward anger cues, which was accompanied by a reduced P300 amplitude (positive voltage deflection about 300 ms after picture onset). The control group showed a P300 differentiation of the affective categories that was absent in the ADHD group. The pronounced anger-processing deficit in ADHD patients might be linked to their interpersonal difficulties and should be addressed in psychotherapy.
Bernat, Edward M; Cadwallader, Meredith; Seo, Dongju; Vizueta, Nathalie; Patrick, Christopher J
2011-01-01
Cognitive control of emotion has been investigated using tasks prompting participants to increase or decrease emotional responding to affective pictures. This study provides a more comprehensive evaluation of responding in this task by including: pleasant and unpleasant pictures, increase and decrease instructions, additional physiological measures, and a fully randomized design. Findings suggest that control efforts did modulate higher-level affective responses indexed by self-reported valence and expressive facial muscles, but not lower-level affective responses indexed by startle blink and heart rate. Similarly, electrocortical measures evidenced expectable affective responses and control-related activity, but no modulation of affective patterns due to the control efforts.
Topolinski, Sascha; Strack, Fritz
2009-02-01
People can intuitively detect whether a word triad has a common remote associate (coherent) or does not have one (incoherent) before and independently of actually retrieving the common associate. The authors argue that semantic coherence increases the processing fluency for coherent triads and that this increased fluency triggers a brief and subtle positive affect, which is the experiential basis of these intuitions. In a series of 11 experiments with 3 different fluency manipulations (figure-ground contrast, repeated exposure, and subliminal visual priming) and 3 different affect inductions (short-timed facial feedback, subliminal facial priming, and affect-laden word triads), high fluency and positive affect independently and additively increased the probability that triads would be judged as coherent, irrespective of actual coherence. The authors could equalize and even reverse coherence judgments (i.e., incoherent triads were judged to be coherent more frequently than were coherent triads). When explicitly instructed, participants were unable to correct their judgments for the influence of affect, although they were aware of the manipulation. The impact of fluency and affect was also generalized to intuitions of visual coherence and intuitions of grammaticality in an artificial grammar learning paradigm. (PsycINFO Database Record (c) 2009 APA, all rights reserved).
ERIC Educational Resources Information Center
Falkmer, Marita; Bjallmark, Anna; Larsson, Matilda; Falkmer, Torbjorn
2011-01-01
Can the disadvantages that persons with Asperger syndrome frequently experience in reading facially expressed emotions be attributed to a different visual perception, affecting their scanning patterns? Visual search strategies, particularly regarding the importance of information from the eye area, and the ability to recognise facially expressed…
Pan, Ning; Wu, Gui-Hua; Zhang, Ling; Zhao, Ya-Fen; Guan, Han; Xu, Cai-Juan; Jing, Jin; Jin, Yu
2017-03-01
To investigate the features of intelligence development, facial expression recognition ability, and the association between them in children with autism spectrum disorder (ASD). A total of 27 ASD children aged 6-16 years (ASD group, full intelligence quotient >70) and age- and gender-matched normally developed children (control group) were enrolled. The Wechsler Intelligence Scale for Children, Fourth Edition, and the Chinese Static Facial Expression Photos were used for intelligence evaluation and the facial expression recognition test. Compared with the control group, the ASD group had significantly lower scores for full intelligence quotient, verbal comprehension index, perceptual reasoning index (PRI), processing speed index (PSI), and working memory index (WMI) (P<0.05). The ASD group also had a significantly lower overall accuracy rate of facial expression recognition and significantly lower accuracy rates for the recognition of happy, angry, sad, and frightened expressions than the control group (P<0.05). In the ASD group, the overall accuracy rate of facial expression recognition and the accuracy rates for the recognition of happy and frightened expressions were positively correlated with PRI (r=0.415, 0.455, and 0.393, respectively; P<0.05). The accuracy rate for the recognition of angry expressions was positively correlated with WMI (r=0.397; P<0.05). Compared with normally developing children, ASD children show delayed intelligence development and impaired expression recognition ability. Perceptual reasoning and working memory abilities are positively correlated with expression recognition ability, suggesting that insufficient perceptual reasoning and working memory abilities may be important factors affecting facial expression recognition in children with ASD.
Developmental Differences in Holistic Interference of Facial Part Recognition
Nakabayashi, Kazuyo; Liu, Chang Hong
2013-01-01
Research has shown that adults’ recognition of a facial part can be disrupted if the part is learnt without a face context but tested in a whole face. This has been interpreted as the holistic interference effect. The present study investigated whether children of 6 and 9–10 years of age would show a similar effect. Participants were asked to judge whether a probe part was the same as or different from a test part, whereby the part was presented either in isolation or in a whole face. The results showed that while all the groups were susceptible to holistic interference, the youngest group was most severely affected. Contrary to the view that piecemeal processing precedes holistic processing in cognitive development, our findings demonstrate that holistic processing is already present at 6 years of age. It is the ability to inhibit the influence of holistic information on piecemeal processing that seems to require a longer period of development, continuing into later childhood and adulthood. PMID:24204847
Pistoia, Francesca; Carolei, Antonio; Sacco, Simona; Conson, Massimiliano; Pistarini, Caterina; Cazzulani, Benedetta; Stewart, Janet; Franceschini, Marco; Sarà, Marco
2015-12-15
There is much evidence to suggest that recognizing and sharing emotions with others require a first-hand experience of those emotions in our own body which, in turn, depends on the adequate perception of our own internal state (interoception) through preserved sensory pathways. Here we explored the contribution of interoception to first-hand emotional experiences and to the recognition of others' emotions. For this aim, 10 individuals with sensory deafferentation as a consequence of high spinal cord injury (SCI; five males and five females; mean age, 48 ± 14.8 years) and 20 healthy subjects matched for age, sex, and education were included in the study. Recognition of facial expressions and judgment of emotionally evocative scenes were investigated in both groups using the Ekman and Friesen set of Pictures of Facial Affect and the International Affective Picture System. A two-way mixed analysis of variance and post hoc comparisons were used to test differences among emotions and groups. Compared with healthy subjects, individuals with SCI, when asked to judge emotionally evocative scenes, had difficulties in judging their own emotional response to complex scenes eliciting fear and anger, while they were able to recognize the same emotions when conveyed by facial expressions. Our findings endorse a simulative view of emotional processing according to which the proper perception of our own internal state (interoception), through preserved sensory pathways, is crucial for first-hand experiences of the more primordial emotions, such as fear and anger.
Effect of an observer's presence on facial behavior during dyadic communication.
Yamamoto, K; Suzuki, N
2012-06-01
In everyday life, people communicate not only with another person but also in front of other people. How do people behave during communication when observed by others? Effects of an observer (presence vs absence) and interpersonal relationship (friends vs strangers vs alone) on facial behavior were examined. Participants viewed film clips that elicited positive affect (film presentation) and discussed their impressions about the clips (conversation). Participants rated their subjective emotions and social motives. Durations of smiles, gazes, and utterances of each participant were coded. The presence of an observer did not affect facial behavior during the film presentation, but did affect gazes during conversation. Whereas the presence of an observer seemed to facilitate affiliation in pairs of strangers, communication between friends was exclusive and not affected by an observer.
Extraction and representation of common feature from uncertain facial expressions with cloud model.
Wang, Shuliang; Chi, Hehua; Yuan, Hanning; Geng, Jing
2017-12-01
Human facial expressions are a key ingredient in conveying an individual's emotion during communication. However, variation in facial expressions hinders the reliable identification of human emotions. In this paper, we present a cloud model to extract facial features for representing human emotion. First, the uncertainties in facial expression are analyzed in the context of the cloud model. The feature extraction and representation algorithm is then built on cloud generators. With the forward cloud generator, arbitrarily many facial expression images can be regenerated to visually represent the three extracted features, each of which plays a different role. The effectiveness of the computing model is tested on the Japanese Female Facial Expression (JAFFE) database, from which three common features are extracted across seven facial expression images. The paper closes with conclusions and remarks.
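The forward cloud generator the abstract refers to is a standard construct in the cloud model: each "drop" is drawn with its own entropy, sampled around the model's entropy parameter. A minimal sketch of the generic generator follows; this is not the authors' feature-extraction pipeline, and the parameter values are purely illustrative:

```python
import math
import random

def forward_cloud(ex, en, he, n):
    """Forward normal cloud generator (cloud model).
    ex: expectation, en: entropy, he: hyper-entropy.
    Returns n "cloud drops" as (value, membership) pairs."""
    drops = []
    for _ in range(n):
        # Each drop gets its own entropy, drawn around en with spread he
        en_i = abs(random.gauss(en, he)) or 1e-12
        x = random.gauss(ex, en_i)                       # drop value
        mu = math.exp(-(x - ex) ** 2 / (2 * en_i ** 2))  # membership degree
        drops.append((x, mu))
    return drops

# Illustrative parameters for a single normalized facial feature
drops = forward_cloud(ex=0.5, en=0.1, he=0.01, n=1000)
```

Running the generator many times yields a scatter of drops concentrated around the expectation, with the hyper-entropy controlling how "fuzzy" the cloud's envelope is.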
A shape-based account for holistic face processing.
Zhao, Mintao; Bülthoff, Heinrich H; Bülthoff, Isabelle
2016-04-01
Faces are processed holistically, so selective attention to 1 face part without any influence of the others often fails. In this study, 3 experiments investigated what type of facial information (shape or surface) underlies holistic face processing and whether generalization of holistic processing to nonexperienced faces requires extensive discrimination experience. Results show that facial shape information alone is sufficient to elicit the composite face effect (CFE), 1 of the most convincing demonstrations of holistic processing, whereas facial surface information is unnecessary (Experiment 1). The CFE is eliminated when faces differ only in surface but not shape information, suggesting that variation of facial shape information is necessary to observe holistic face processing (Experiment 2). Removing 3-dimensional (3D) facial shape information also eliminates the CFE, indicating the necessity of 3D shape information for holistic face processing (Experiment 3). Moreover, participants show similar holistic processing for faces with and without extensive discrimination experience (i.e., own- and other-race faces), suggesting that generalization of holistic processing to nonexperienced faces requires facial shape information, but does not necessarily require further individuation experience. These results provide compelling evidence that facial shape information underlies holistic face processing. This shape-based account not only offers a consistent explanation for previous studies of holistic face processing, but also suggests a new basis, in addition to expertise, for the generalization of holistic processing to different types of faces and to nonface objects. (PsycINFO Database Record (c) 2016 APA, all rights reserved.)
Effects of the BDNF Val66Met polymorphism on neural responses to facial emotion.
Mukherjee, Prerona; Whalley, Heather C; McKirdy, James W; McIntosh, Andrew M; Johnstone, Eve C; Lawrie, Stephen M; Hall, Jeremy
2011-03-31
The brain-derived neurotrophic factor (BDNF) Val66Met polymorphism has been associated with affective disorders, but its role in emotion processing has not been fully established. Due to the clinically heterogeneous nature of these disorders, studying the effect of genetic variation in the BDNF gene on a common attribute such as fear processing may elucidate how the BDNF Val66Met polymorphism impacts brain function. Here we use functional magnetic resonance imaging to examine the effect of the BDNF Val66Met genotype on neural activity during fear processing. Forty healthy participants performed an implicit fear task during scanning, where subjects made gender judgments from facial images with neutral or fearful emotion. Subjects were tested for facial emotion recognition post-scan. Functional connectivity was investigated using psycho-physiological interactions. Subjects were genotyped for the BDNF Val66Met polymorphism and the measures compared between genotype groups. Met carriers showed overactivation in the anterior cingulate cortex (ACC), brainstem and insula bilaterally for fear processing, along with reduced functional connectivity from the ACC to the left hippocampus, and impaired fear recognition ability. The results show that during fear processing, Met allele carriers show an increased neural response in regions previously implicated in mediating autonomic arousal. Further, the Met carriers show decreased functional connectivity with the hippocampus, which may reflect differential retrieval of emotional associations. Together, these effects show significant differences in the neural substrate for fear processing with genetic variation in BDNF. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Evidence of a Shift from Featural to Configural Face Processing in Infancy
ERIC Educational Resources Information Center
Schwarzer, Gudrun; Zauner, Nicola; Jovanovic, Bianca
2007-01-01
Two experiments examined whether 4-, 6-, and 10-month-old infants process natural-looking faces featurally (i.e., processing internal facial features independently of the facial context) or holistically (processing the features in conjunction with the facial context). Infants were habituated to two faces and looking time was measured. After…
Mike, Andrea; Strammer, Erzsebet; Aradi, Mihaly; Orsi, Gergely; Perlaki, Gabor; Hajnal, Andras; Sandor, Janos; Banati, Miklos; Illes, Eniko; Zaitsev, Alexander; Herold, Robert; Guttmann, Charles R G; Illes, Zsolt
2013-01-01
Successful socialization requires the ability to understand others' mental states. This ability, called mentalization (Theory of Mind), may become deficient and contribute to everyday life difficulties in multiple sclerosis. We aimed to explore the impact of brain pathology on mentalization performance in multiple sclerosis. Mentalization performance of 49 patients with multiple sclerosis was compared to that of 24 age- and gender-matched healthy controls. T1- and T2-weighted three-dimensional brain MRI images were acquired at 3 Tesla from patients with multiple sclerosis and 18 gender- and age-matched healthy controls. We assessed overall brain cortical thickness in patients with multiple sclerosis and the scanned healthy controls, and measured the total and regional T1 and T2 white matter lesion volumes in patients with multiple sclerosis. Performance in tests of recognition of mental states and emotions from facial expressions and eye gazes correlated with both total T1-lesion load and regional T1-lesion load of association fiber tracts interconnecting cortical regions related to visual and emotion processing (genu and splenium of corpus callosum, right inferior longitudinal fasciculus, right inferior fronto-occipital fasciculus, uncinate fasciculus). Both of these tests showed correlations with specific cortical areas involved in emotion recognition from facial expressions (right and left fusiform face area, frontal eye field), processing of emotions (right entorhinal cortex) and socially relevant information (left temporal pole). Thus, both a disconnection mechanism due to white matter lesions and cortical thinning of specific brain areas may result in a cognitive deficit in multiple sclerosis affecting emotion and mental-state processing from facial expressions and contributing to the everyday and social life difficulties of these patients.
Emotion Recognition Ability: A Multimethod-Multitrait Study.
ERIC Educational Resources Information Center
Gaines, Margie; And Others
A common paradigm in measuring the ability to recognize facial expressions of emotion is to present photographs of facial expressions and to ask subjects to identify the emotion. The Affect Blend Test (ABT) uses this method of assessment and is scored for accuracy on specific affects as well as total accuracy. Another method of measuring affect…
Poor Facial Affect Recognition among Boys with Duchenne Muscular Dystrophy
ERIC Educational Resources Information Center
Hinton, V. J.; Fee, R. J.; De Vivo, D. C.; Goldstein, E.
2007-01-01
Children with Duchenne or Becker muscular dystrophy (MD) have delayed language and poor social skills and some meet criteria for Pervasive Developmental Disorder, yet they are identified by molecular, rather than behavioral, characteristics. To determine whether comprehension of facial affect is compromised in boys with MD, children were given a…
Facial emotion recognition in Parkinson's disease: A review and new hypotheses
Vérin, Marc; Sauleau, Paul; Grandjean, Didier
2018-01-01
Parkinson's disease (PD) is a neurodegenerative disorder classically characterized by motor symptoms. Among them, hypomimia affects facial expressiveness and social communication and has a highly negative impact on patients' and relatives' quality of life. Patients also frequently experience nonmotor symptoms, including emotional-processing impairments, leading to difficulty in recognizing emotions from faces. Aside from its theoretical importance, understanding the disruption of facial emotion recognition in PD is crucial for improving quality of life for both patients and caregivers, as this impairment is associated with heightened interpersonal difficulties. However, studies assessing abilities in recognizing facial emotions in PD still report contradictory outcomes. The origins of this inconsistency are unclear, and several questions (regarding the role of dopamine replacement therapy or the possible consequences of hypomimia) remain unanswered. We therefore undertook a fresh review of relevant articles focusing on facial emotion recognition in PD to deepen current understanding of this nonmotor feature, exploring multiple significant potential confounding factors, both clinical and methodological, and discussing probable pathophysiological mechanisms. This led us to examine recent proposals about the role of basal ganglia-based circuits in emotion and to consider the involvement of facial mimicry in this deficit from the perspective of embodied simulation theory. We believe our findings will inform clinical practice and increase fundamental knowledge, particularly in relation to potential embodied emotion impairment in PD. © 2018 The Authors. Movement Disorders published by Wiley Periodicals, Inc. on behalf of International Parkinson and Movement Disorder Society. PMID:29473661
Hannah, Beverly; Wang, Yue; Jongman, Allard; Sereno, Joan A; Cao, Jiguo; Nie, Yunlong
2017-01-01
Speech perception involves multiple input modalities. Research has indicated that perceivers establish cross-modal associations between auditory and visuospatial events to aid perception. Such intermodal relations can be particularly beneficial for speech development and learning, where infants and non-native perceivers need additional resources to acquire and process new sounds. This study examines how facial articulatory cues and co-speech hand gestures mimicking pitch contours in space affect non-native Mandarin tone perception. Native English as well as Mandarin perceivers identified tones embedded in noise with either congruent or incongruent Auditory-Facial (AF) and Auditory-Facial-Gestural (AFG) inputs. Native Mandarin results showed the expected ceiling-level performance in the congruent AF and AFG conditions. In the incongruent conditions, while AF identification was primarily auditory-based, AFG identification was partially based on gestures, demonstrating the use of gestures as valid cues in tone identification. The English perceivers' performance was poor in the congruent AF condition, but improved significantly in AFG. While the incongruent AF identification showed some reliance on facial information, incongruent AFG identification relied more on gestural than auditory-facial information. These results indicate positive effects of facial and especially gestural input on non-native tone perception, suggesting that cross-modal (visuospatial) resources can be recruited to aid auditory perception when phonetic demands are high. The current findings may inform patterns of tone acquisition and development, suggesting how multi-modal speech enhancement principles may be applied to facilitate speech learning.
Ventura, Joseph; Wood, Rachel C.; Jimenez, Amy M.; Hellemann, Gerhard S.
2014-01-01
Background: In schizophrenia patients, one of the most commonly studied deficits of social cognition is emotion processing (EP), which has documented links to facial recognition (FR). But, how are deficits in facial recognition linked to emotion processing deficits? Can neurocognitive and symptom correlates of FR and EP help differentiate the unique contribution of FR to the domain of social cognition? Methods: A meta-analysis of 102 studies (combined n = 4826) in schizophrenia patients was conducted to determine the magnitude and pattern of relationships between facial recognition, emotion processing, neurocognition, and type of symptom. Results: Meta-analytic results indicated that facial recognition and emotion processing are strongly interrelated (r = .51). In addition, the relationship between FR and EP through voice prosody (r = .58) is as strong as the relationship between FR and EP based on facial stimuli (r = .53). Further, the relationship between emotion recognition, neurocognition, and symptoms is independent of the emotion processing modality (facial stimuli and voice prosody). Discussion: The association between FR and EP that occurs through voice prosody suggests that FR is a fundamental cognitive process. The observed links between FR and EP might be due to bottom-up associations between neurocognition and EP, and not simply because most emotion recognition tasks use visual facial stimuli. In addition, links with symptoms, especially negative symptoms and disorganization, suggest possible symptom mechanisms that contribute to FR and EP deficits. PMID:24268469
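Meta-analytic mean correlations such as the r = .51 reported here are conventionally pooled by converting each study's r to Fisher's z, averaging with weights of n − 3, and back-transforming. A minimal fixed-effect sketch, with made-up study values rather than this meta-analysis's data:

```python
import math

def pooled_correlation(studies):
    """Fixed-effect meta-analytic mean correlation.
    studies: list of (r, n) pairs; each r is Fisher z-transformed,
    weighted by n - 3, averaged, then back-transformed."""
    num = sum((n - 3) * math.atanh(r) for r, n in studies)
    den = sum(n - 3 for _, n in studies)
    return math.tanh(num / den)

# Hypothetical study correlations and sample sizes
studies = [(0.55, 120), (0.48, 300), (0.51, 90)]
pooled = pooled_correlation(studies)
```

The n − 3 weighting reflects that the sampling variance of Fisher's z is 1/(n − 3), so larger studies pull the pooled estimate toward their own r.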
Lui, Joyce H L; Barry, Christopher T; Sacco, Donald F
2016-09-01
Although empathy deficits are thought to be associated with callous-unemotional (CU) traits, findings remain equivocal and little is known about what specific abilities may underlie these purported deficits. Affective perspective-taking (APT) and facial emotion recognition may be implicated, given their independent associations with both empathy and CU traits. The current study examined how CU traits relate to cognitive and affective empathy and whether APT and facial emotion recognition mediate these relations. Participants were 103 adolescents (70 males) aged 16-18 attending a residential programme. CU traits were negatively associated with cognitive and affective empathy to a similar degree. The association between CU traits and affective empathy was partially mediated by APT. Results suggest that assessing mechanisms that may underlie empathic deficits, such as perspective-taking, may be important for youth with CU traits and may inform targets of intervention.
Fu, Cynthia H Y; Williams, Steven C R; Cleare, Anthony J; Brammer, Michael J; Walsh, Nicholas D; Kim, Jieun; Andrew, Chris M; Pich, Emilio Merlo; Williams, Pauline M; Reed, Laurence J; Mitterschiffthaler, Martina T; Suckling, John; Bullmore, Edward T
2004-09-01
Depression is associated with interpersonal difficulties related to abnormalities in affective facial processing. The aim was to map brain systems activated by sad facial affect processing in patients with depression and to identify brain functional correlates of antidepressant treatment and symptomatic response. Two groups underwent scanning twice using functional magnetic resonance imaging (fMRI) during an 8-week period. The event-related fMRI paradigm entailed incidental affect recognition of facial stimuli morphed to express discriminable intensities of sadness. Participants were recruited by advertisement from the local population; depressed subjects were treated as outpatients. We matched 19 medication-free, acutely symptomatic patients satisfying DSM-IV criteria for unipolar major depressive disorder by age, sex, and IQ with 19 healthy volunteers. After the baseline assessment, patients received fluoxetine hydrochloride, 20 mg/d, for 8 weeks. Average activation (capacity) and differential response to variable affective intensity (dynamic range) were estimated in each fMRI time series. We used analysis of variance to identify brain regions that demonstrated a main effect of group (depressed vs healthy subjects) and a group x time interaction (attributable to antidepressant treatment). Change in brain activation associated with reduction of depressive symptoms in the patient group was identified by means of regression analysis. Permutation tests were used for inference. Over time, depressed subjects showed reduced capacity for activation in the left amygdala, ventral striatum, and frontoparietal cortex and a negatively correlated increase of dynamic range in the prefrontal cortex. Symptomatic improvement was associated with reduction of dynamic range in the pregenual cingulate cortex, ventral striatum, and cerebellum.
Antidepressant treatment reduces left limbic, subcortical, and neocortical capacity for activation in depressed subjects and increases the dynamic range of the left prefrontal cortex. Changes in anterior cingulate function associated with symptomatic improvement indicate that fMRI may be a useful surrogate marker of antidepressant treatment response.
Chiu, Isabelle; Piguet, Olivier; Diehl-Schmid, Janine; Riedl, Lina; Beck, Johannes; Leyhe, Thomas; Holsboer-Trachsler, Edith; Berres, Manfred; Monsch, Andreas U; Sollberger, Marc
2016-11-01
Features of behavioral variant frontotemporal dementia (bvFTD) such as executive dysfunction, apathy, and impaired empathic abilities are also observed in major depressive disorder (MDD). This may partly explain why early-stage bvFTD is often misdiagnosed as MDD. New assessment tools are thus needed to improve early diagnosis of bvFTD. Although emotion processing is affected in bvFTD and MDD, growing evidence indicates that the pattern of emotion processing deficits varies between the two disorders. As such, emotion processing paradigms have substantial potential to distinguish bvFTD from MDD. The current study compared 25 patients with bvFTD, 21 patients with MDD, 21 patients with Alzheimer disease (AD) dementia, and 31 healthy participants on a novel facial emotion intensity rating task. Stimuli comprised morphed faces from the Ekman and Friesen stimulus set containing faces of each sex with two different degrees of emotion intensity for each of the six basic emotions. Analyses of covariance uncovered a significant dissociation between bvFTD and MDD patients in rating the intensity of negative emotions overall (i.e., bvFTD patients underrated negative emotions overall, whereas MDD patients overrated negative emotions overall compared with healthy participants). In contrast, AD dementia patients rated negative emotions similarly to healthy participants, suggesting no impact of cognitive deficits on rating facial emotions. By strongly differentiating bvFTD and MDD patients through negative facial emotions, this sensitive and short rating task might help improve the early diagnosis of bvFTD. Copyright © 2016 American Association for Geriatric Psychiatry. All rights reserved.
[Prosopagnosia and facial expression recognition].
Koyama, Shinichi
2014-04-01
This paper reviews clinical neuropsychological studies that have indicated that the recognition of a person's identity and the recognition of facial expressions are processed by different cortical and subcortical areas of the brain. The fusiform gyrus, especially the right fusiform gyrus, plays an important role in the recognition of identity. The superior temporal sulcus, amygdala, and medial frontal cortex play important roles in facial-expression recognition. Both facial recognition and facial-expression recognition are complex cognitive processes that involve several regions of the brain.
ERIC Educational Resources Information Center
Rozga, Agata; King, Tricia Z.; Vuduc, Richard W.; Robins, Diana L.
2013-01-01
We examined facial electromyography (fEMG) activity to dynamic, audio-visual emotional displays in individuals with autism spectrum disorders (ASD) and typically developing (TD) individuals. Participants viewed clips of happy, angry, and fearful displays that contained both facial expression and affective prosody while surface electrodes measured…
Ardizzi, Martina; Sestito, Mariateresa; Martini, Francesca; Umiltà, Maria Alessandra; Ravera, Roberto; Gallese, Vittorio
2014-01-01
Age-group membership effects on explicit recognition of emotional facial expressions have been widely demonstrated. In this study we investigated whether age-group membership could also affect implicit physiological responses, such as facial mimicry and autonomic regulation, to the observation of emotional facial expressions. To this aim, facial electromyography (EMG) and respiratory sinus arrhythmia (RSA) were recorded from teenage and adult participants during the observation of facial expressions performed by teenage and adult models. Results highlighted that teenagers exhibited greater facial EMG responses to peers' facial expressions, whereas adults showed higher RSA responses to adult facial expressions. The different physiological modalities through which teenagers and adults respond to peers' emotional expressions are likely to reflect two different ways of engaging in social interactions with people of their own age. Findings confirmed that age is an important and powerful social feature that modulates interpersonal interactions by influencing low-level physiological responses. PMID:25337916
Direct effects of diazepam on emotional processing in healthy volunteers
Murphy, S. E.; Downham, C.; Cowen, P. J.
2008-01-01
Rationale: Pharmacological agents used in the treatment of anxiety have been reported to decrease threat-relevant processing in patients and healthy controls, suggesting a potentially relevant mechanism of action. However, the effects of the anxiolytic diazepam have typically been examined at sedative doses, which do not allow the direct actions on emotional processing to be fully separated from global effects of the drug on cognition and alertness. Objectives: The aim of this study was to investigate the effect of a lower, but still clinically effective, dose of diazepam on emotional processing in healthy volunteers. Materials and methods: Twenty-four participants were randomised to receive a single dose of diazepam (5 mg) or placebo. Sixty minutes later, participants completed a battery of psychological tests, including measures of non-emotional cognitive performance (reaction time and sustained attention) and emotional processing (affective modulation of the startle reflex, attentional dot probe, facial expression recognition, and emotional memory). Mood and subjective experience were also measured. Results: Diazepam significantly modulated attentional vigilance to masked emotional faces and significantly decreased overall startle reactivity. Diazepam did not significantly affect mood, alertness, response times, facial expression recognition, or sustained attention. Conclusions: At non-sedating doses, diazepam produces effects on attentional vigilance and startle responsivity that are consistent with its anxiolytic action. This may be an underlying mechanism through which benzodiazepines exert their therapeutic effects in clinical anxiety. PMID:18581100
Pomahac, Bohdan; Aflaki, Pejman; Nelson, Charles; Balas, Benjamin
2010-05-01
Partial facial allotransplantation is an emerging option in reconstruction of central facial defects, providing function and aesthetic appearance. Ethical debate partly stems from uncertainty surrounding identity aspects of the procedure. There is no objective evidence regarding the effect of donors' transplanted facial structures on appearance change of the recipients and its influence on facial recognition of donors and recipients. Full-face frontal view color photographs of 100 volunteers were taken at a distance of 150 cm with a digital camera (Nikon/DX80). Photographs were taken in front of a blue background, and with a neutral facial expression. Using image-editing software (Adobe Photoshop CS3), central facial transplantation was performed between participants. Twenty observers performed a familiar-face recognition task to identify 40 post-transplant composite faces presented individually on the screen at a viewing distance of 60 cm, with an exposure time of 5 s. Each composite face comprised a face familiar to the observers and an unfamiliar one. Trials were run with and without external facial features (head contour, hair and ears). Two variables were defined: 'Appearance Transfer' refers to transfer of the donor's appearance to the recipient; 'Appearance Persistence' deals with the extent of the recipient's appearance change post-transplantation. A t-test was run to determine whether the rates of Appearance Transfer differed from Appearance Persistence. The average Appearance Transfer rate (2.6%) was significantly lower than the Appearance Persistence rate (66%) (P<0.001), indicating that transfer of the donor's appearance to the recipient is negligible, whereas recipients will be identified the majority of the time. External facial features were important in facial recognition of recipients, evidenced by a significant rise in Appearance Persistence from 19% in the absence of external features to 66% when those features were present (P<0.01). 
This study may be helpful in the informed consent process of prospective recipients. It may also benefit the education of donors' families and is expected to positively affect their decision to consent to facial tissue donation. Copyright (c) 2009 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.
Affective Computing and the Impact of Gender and Age
Rukavina, Stefanie; Gruss, Sascha; Hoffmann, Holger; Tan, Jun-Wen; Walter, Steffen; Traue, Harald C.
2016-01-01
Affective computing aims at the detection of users’ mental states, in particular, emotions and dispositions during human-computer interactions. Detection can be achieved by measuring multimodal signals, namely, speech, facial expressions and/or psychobiology. Over the past years, one major approach was to identify the best features for each signal using different classification methods. Although this is of high priority, other subject-specific variables should not be neglected. In our study, we analyzed the effect of gender, age, personality and gender roles on the extracted psychobiological features (derived from skin conductance level, facial electromyography and heart rate variability) as well as the influence on the classification results. In an experimental human-computer interaction, five different affective states with picture material from the International Affective Picture System and ULM pictures were induced. A total of 127 subjects participated in the study. Among all potentially influencing variables (gender has been reported to be influential), age was the only variable that correlated significantly with psychobiological responses. In summary, the conducted classification processes resulted in 20% classification accuracy differences according to age and gender, especially when comparing the neutral condition with four other affective states. We suggest taking age and gender specifically into account for future studies in affective computing, as these may lead to an improvement of emotion recognition accuracy. PMID:26939129
Hoaken, Peter N S; Allaby, David B; Earle, Jeff
2007-01-01
Violence is a social problem that carries enormous costs; however, our understanding of its etiology is quite limited. A large body of research exists, which suggests a relationship between abnormalities of the frontal lobe and aggression; as a result, many researchers have implicated deficits in so-called "executive function" as an antecedent to aggressive behaviour. Another possibility is that violence may be related to problems interpreting facial expressions of emotion, a deficit associated with many forms of psychopathology, and an ability linked to the prefrontal cortex. The current study investigated performance on measures of executive function and on a facial-affect recognition task in 20 violent offenders, 20 non-violent offenders, and 20 controls. In support of our hypotheses, both offender groups performed significantly more poorly on measures of executive function relative to controls. In addition, violent offenders were significantly poorer on the facial-affect recognition task than either of the other two groups. Interestingly, scores on these measures were significantly correlated, with executive deficits associated with difficulties accurately interpreting facial affect. The implications of these results are discussed in terms of a broader understanding of violent behaviour. Copyright 2007 Wiley-Liss, Inc.
Unattractive infant faces elicit negative affect from adults.
Schein, Stevie S; Langlois, Judith H
2015-02-01
We examined the relationship between infant attractiveness and adult affect by investigating whether differing levels of infant facial attractiveness elicit facial muscle movement correlated with positive and negative affect from adults (N=87) using electromyography. Unattractive infant faces evoked significantly more corrugator supercilii and levator labii superioris movement (physiological correlates of negative affect) than attractive infant faces. These results suggest that unattractive infants may be at risk for negative affective responses from adults, though the relationship between those responses and caregiving behavior remains elusive. Copyright © 2015 Elsevier Inc. All rights reserved.
The effects of postnatal maternal depression and anxiety on the processing of infant faces
Arteche, Adriane; Joormann, Jutta; Harvey, Allison; Craske, Michelle; Gotlib, Ian H.; Lehtonen, Annukka; Counsell, Nicholas; Stein, Alan
2011-01-01
Background: Postnatally depressed mothers have difficulties responding appropriately to their infants. The quality of the mother–child relationship depends on a mother's ability to respond to her infant's cues, which are largely non-verbal. Therefore, it is likely that difficulties in a mother's appraisal of her infants' facial expressions will affect the quality of mother–infant interaction. This study aimed to investigate the effects of postnatal depression and anxiety on the processing of infants' facial expressions. Method: A total of 89 mothers, 34 with Generalised Anxiety Disorder, 21 with Major Depressive Disorder, and 34 controls, completed a ‘morphed infants’ faces task when their children were between 10 and 18 months. Results: Overall, mothers were more likely to identify happy faces accurately and at lower intensity than sad faces. Depressed compared to control participants, however, were less likely to accurately identify happy infant faces. Interestingly, mothers with GAD tended to identify happy faces at a lower intensity than controls. There were no differences between the groups in relation to sad faces. Limitations: Our sample was relatively small and further research is needed to investigate the links between mothers' perceptions of infant expressions and both maternal responsiveness and later measures of child development. Conclusion: Our findings have potential clinical implications as the difficulties in the processing of positive facial expressions in depression may lead to less maternal responsiveness to positive affect in the offspring and may diminish the quality of the mother–child interactions. Results for participants with GAD are consistent with the literature demonstrating that persons with GAD are intolerant of uncertainty and seek reassurance due to their worries. PMID:21641652
Functional MRI of facial emotion processing in left temporal lobe epilepsy.
Szaflarski, Jerzy P; Allendorfer, Jane B; Heyse, Heidi; Mendoza, Lucy; Szaflarski, Basia A; Cohen, Nancy
2014-03-01
Temporal lobe epilepsy (TLE) may negatively affect the ability to recognize emotions. This study aimed to determine the cortical correlates of facial emotion processing (happy, sad, fearful, and neutral) in patients with well-characterized left TLE (LTLE) and to examine the effect of seizure control on emotion processing. We enrolled 34 consecutive patients with LTLE and 30 matched healthy control (HC) subjects. Participants underwent functional MRI (fMRI) with an event-related facial emotion recognition task. The seizures of 17 patients were controlled (no seizure in at least 3 months; LTLE-sz), and 17 continued to experience frequent seizures (LTLE+sz). Mood was assessed with the Beck Depression Inventory (BDI) and the Profile of Mood States (POMS). There were no differences in demographic characteristics and measures of mood between HC subjects and patients with LTLE. In patients with LTLE, fMRI showed decreased blood oxygenation level dependent (BOLD) signal in the hippocampus/parahippocampus and cerebellum in processing of happy faces and increased BOLD signal in occipital regions in response to fearful faces. Comparison of groups with LTLE+sz and LTLE-sz showed worse BDI and POMS scores in LTLE+sz (all p<0.05) except for POMS tension/anxiety (p=0.067). Functional MRI revealed increased BOLD signal in patients with LTLE+sz in the left precuneus and left parahippocampus for "fearful" faces and in the left periarcheocortex for "neutral" faces. There was a correlation between the fMRI and Total Mood Disturbance in the left precuneus in LTLE-sz (p=0.019) and in LTLE+sz (p=0.018). Overall, LTLE appears to have a relatively minor effect on the cortical underpinnings of facial emotion processing, while the effect of seizure state (controlled vs. not controlled) is more pronounced, indicating a significant relationship between seizure control and emotion processing. Copyright © 2014 Elsevier Inc. All rights reserved.
Ferrari, Pier Francesco; Barbot, Anna; Bianchi, Bernardo; Ferri, Andrea; Garofalo, Gioacchino; Bruno, Nicola; Coudé, Gino; Bertolini, Chiara; Ardizzi, Martina; Nicolini, Ylenia; Belluardo, Mauro; Stefani, Elisa De
2017-05-01
Studies of the last twenty years on the motor and premotor cortices of primates demonstrated that the motor system is involved not only in the control and initiation of movements but also in higher cognitive processes, such as action understanding, imitation, and empathy. Mirror neurons are only one example of such theoretical shift. Their properties demonstrate that motor and sensory processing are coupled in the brain. Such knowledge has also been central for designing new neurorehabilitative therapies for patients suffering from brain injuries and consequent motor deficits. Moebius syndrome (MBS) patients, for example, are incapable of moving their facial muscles, which are fundamental for affective communication. These patients face an important challenge after having undergone corrective surgery: reanimating the transplanted muscles to achieve voluntary control of smiling. We propose two new complementary rehabilitative approaches for MBS patients based on observation/imitation therapy (Facial Imitation Therapy, FIT) and on hand-mouth motor synergies (Synergistic Activity Therapy, SAT). Preliminary results show that our intervention protocol is a promising approach for neurorehabilitation of patients with facial palsy. Copyright © 2017 Elsevier Ltd. All rights reserved.
Dissociable roles of internal feelings and face recognition ability in facial expression decoding.
Zhang, Lin; Song, Yiying; Liu, Ling; Liu, Jia
2016-05-15
The problem of emotion recognition has been tackled by researchers in both affective computing and cognitive neuroscience. While affective computing relies on analyzing visual features from facial expressions, it has been proposed that humans recognize emotions by internally simulating the emotional states conveyed by others' expressions, in addition to perceptual analysis of facial features. Here we investigated whether and how our internal feelings contributed to the ability to decode facial expressions. In two independent large samples of participants, we observed that individuals who generally experienced richer internal feelings exhibited a higher ability to decode facial expressions, and the contribution of internal feelings was independent of face recognition ability. Further, using voxel-based morphometry, we found that the gray matter volume (GMV) of bilateral superior temporal sulcus (STS) and the right inferior parietal lobule was associated with facial expression decoding through the mediating effect of internal feelings, while the GMV of bilateral STS, precuneus, and the right central opercular cortex contributed to facial expression decoding through the mediating effect of face recognition ability. In addition, the clusters in bilateral STS involved in the two components were neighboring yet separate. Our results may provide clues about the mechanism by which internal feelings, in addition to face recognition ability, serve as an important instrument for humans in facial expression decoding. Copyright © 2016 Elsevier Inc. All rights reserved.
Cues of fatigue: effects of sleep deprivation on facial appearance.
Sundelin, Tina; Lekander, Mats; Kecklund, Göran; Van Someren, Eus J W; Olsson, Andreas; Axelsson, John
2013-09-01
To investigate the facial cues by which one recognizes that someone is sleep deprived versus not sleep deprived. Experimental laboratory study. Karolinska Institutet, Stockholm, Sweden. Forty observers (20 women, mean age 25 ± 5 y) rated 20 facial photographs with respect to fatigue, 10 facial cues, and sadness. The stimulus material consisted of 10 individuals (five women) photographed at 14:30 after normal sleep and after 31 h of sleep deprivation following a night with 5 h of sleep. Ratings of fatigue, fatigue-related cues, and sadness in facial photographs. The faces of sleep deprived individuals were perceived as having more hanging eyelids, redder eyes, more swollen eyes, darker circles under the eyes, paler skin, more wrinkles/fine lines, and more droopy corners of the mouth (effects ranging from b = +3 ± 1 to b = +15 ± 1 mm on 100-mm visual analog scales, P < 0.01). The ratings of fatigue were related to glazed eyes and to all the cues affected by sleep deprivation (P < 0.01). Ratings of rash/eczema or tense lips were not significantly affected by sleep deprivation, nor associated with judgements of fatigue. In addition, sleep-deprived individuals looked sadder than after normal sleep, and sadness was related to looking fatigued (P < 0.01). The results show that sleep deprivation affects features relating to the eyes, mouth, and skin, and that these features function as cues of sleep loss to other people. Because these facial regions are important in the communication between humans, facial cues of sleep deprivation and fatigue may carry social consequences for the sleep deprived individual in everyday life.
Yalcin-Siedentopf, Nursen; Hoertnagl, Christine M; Biedermann, Falko; Baumgartner, Susanne; Deisenhammer, Eberhard A; Hausmann, Armand; Kaufmann, Alexandra; Kemmler, Georg; Mühlbacher, Moritz; Rauch, Anna-Sophia; Fleischhacker, W Wolfgang; Hofer, Alex
2014-02-01
Both schizophrenia and bipolar disorder (BD) have consistently been associated with deficits in facial affect recognition (FAR). These impairments have been related to various aspects of social competence and functioning and are relatively stable over time. However, individuals in remission may outperform patients experiencing an acute phase of the disorders. The present study directly contrasted FAR in symptomatically remitted patients with schizophrenia or BD and healthy volunteers and investigated its relationship with patients' outcomes. Compared to healthy control subjects, schizophrenia patients were impaired in the recognition of angry, disgusted, sad and happy facial expressions, while BD patients showed deficits only in the recognition of disgusted and happy facial expressions. When directly comparing the two patient groups, individuals suffering from BD outperformed those with schizophrenia in the recognition of expressions depicting anger. There was no significant association between affect recognition abilities and symptomatic or psychosocial outcomes in schizophrenia patients. Among BD patients, relatively higher depression scores were associated with impairments in both the identification of happy faces and psychosocial functioning. Overall, our findings indicate that during periods of symptomatic remission the recognition of facial affect may be less impaired in patients with BD than in those suffering from schizophrenia. However, in the psychosocial context BD patients seem to be more sensitive to residual symptomatology. Copyright © 2013 Elsevier B.V. All rights reserved.
Heller, Aaron S.; Greischar, Lawrence L; Honor, Ann; Anderle, Michael J; Davidson, Richard J.
2011-01-01
The development of functional neuroimaging of emotion holds the promise to enhance our understanding of the biological bases of affect and improve our knowledge of psychiatric diseases. However, up to this point, researchers have been unable to objectively, continuously and unobtrusively measure the intensity and dynamics of affect concurrently with functional magnetic resonance imaging (fMRI). This has hindered the development and generalizability of our field. Facial electromyography (EMG) is an objective, reliable, valid, sensitive, and unobtrusive measure of emotion. Here, we report the successful development of a method for simultaneously acquiring fMRI and facial EMG. The ability to simultaneously acquire brain activity and facial physiology will allow affective neuroscientists to address theoretical, psychiatric, and individual difference questions in a more rigorous and generalizable way. PMID:21742043
Contemporary solutions for the treatment of facial nerve paralysis.
Garcia, Ryan M; Hadlock, Tessa A; Klebuc, Michael J; Simpson, Roger L; Zenn, Michael R; Marcus, Jeffrey R
2015-06-01
After reviewing this article, the participant should be able to: 1. Understand the most modern indications and technique for neurotization, including masseter-to-facial nerve transfer (fifth-to-seventh cranial nerve transfer). 2. Contrast the advantages and limitations associated with contiguous muscle transfers and free-muscle transfers for facial reanimation. 3. Understand the indications for a two-stage and one-stage free gracilis muscle transfer for facial reanimation. 4. Apply nonsurgical adjuvant treatments for acute facial nerve paralysis. Facial expression is a complex neuromotor and psychomotor process that is disrupted in patients with facial paralysis, breaking the link between emotion and physical expression. Contemporary reconstructive options are being implemented in patients with facial paralysis. While static procedures provide facial symmetry at rest, true 'facial reanimation' requires restoration of facial movement. Contemporary treatment options include neurotization procedures (a new motor nerve is used to restore innervation to a viable muscle), contiguous regional muscle transfer (most commonly temporalis muscle transfer), microsurgical free muscle transfer, and nonsurgical adjuvants used to balance facial symmetry. Each approach has advantages and disadvantages along with ongoing controversies and should be individualized for each patient. Treatments for patients with facial paralysis continue to evolve in order to restore the complex psychomotor process of facial expression.
Conson, Massimiliano; Errico, Domenico; Mazzarella, Elisabetta; Giordano, Marianna; Grossi, Dario; Trojano, Luigi
2015-01-01
Recent neurofunctional studies suggested that lateral prefrontal cortex is a domain-general cognitive control area modulating computation of social information. Neuropsychological evidence reported dissociations between cognitive and affective components of social cognition. Here, we tested whether performance on social cognitive and affective tasks can be modulated by transcranial direct current stimulation (tDCS) over dorsolateral prefrontal cortex (DLPFC). To this aim, we compared the effects of tDCS on explicit recognition of emotional facial expressions (affective task), and on one cognitive task assessing the ability to adopt another person's visual perspective. In a randomized, cross-over design, male and female healthy participants performed the two experimental tasks after bi-hemispheric tDCS (sham, left anodal/right cathodal, and right anodal/left cathodal) applied over DLPFC. Results showed that only in male participants explicit recognition of fearful facial expressions was significantly faster after anodal right/cathodal left stimulation with respect to anodal left/cathodal right and sham stimulations. In the visual perspective taking task, instead, anodal right/cathodal left stimulation negatively affected both male and female participants' tendency to adopt another's point of view. These findings demonstrated that concurrent facilitation of right and inhibition of left lateral prefrontal cortex can speed up males' responses to threatening faces whereas it interferes with the ability to adopt another's viewpoint independently of gender. Thus, stimulation of cognitive control areas can lead to different effects on social cognitive skills depending on the affective vs. cognitive nature of the task, and on the gender-related differences in neural organization of emotion processing.
Sex-related differences in behavioral and amygdalar responses to compound facial threat cues.
Im, Hee Yeon; Adams, Reginald B; Cushing, Cody A; Boshyan, Jasmine; Ward, Noreen; Kveraga, Kestutis
2018-03-08
During face perception, we integrate facial expression and eye gaze to take advantage of their shared signals. For example, fear with averted gaze provides a congruent avoidance cue, signaling both threat presence and its location, whereas fear with direct gaze sends an incongruent cue, leaving threat location ambiguous. It has been proposed that the processing of different combinations of threat cues is mediated by dual processing routes: reflexive processing via magnocellular (M) pathway and reflective processing via parvocellular (P) pathway. Because growing evidence has identified a variety of sex differences in emotional perception, here we also investigated how M and P processing of fear and eye gaze might be modulated by observer's sex, focusing on the amygdala, a structure important to threat perception and affective appraisal. We adjusted luminance and color of face stimuli to selectively engage M or P processing and asked observers to identify emotion of the face. Female observers showed more accurate behavioral responses to faces with averted gaze and greater left amygdala reactivity both to fearful and neutral faces. Conversely, males showed greater right amygdala activation only for M-biased averted-gaze fear faces. In addition to functional reactivity differences, females had proportionately greater bilateral amygdala volumes, which positively correlated with behavioral accuracy for M-biased fear. Conversely, in males only the right amygdala volume was positively correlated with accuracy for M-biased fear faces. Our findings suggest that M and P processing of facial threat cues is modulated by functional and structural differences in the amygdalae associated with observer's sex. © 2018 Wiley Periodicals, Inc.
Responsibility and the sense of agency enhance empathy for pain
Lepron, Evelyne; Causse, Michaël; Farrer, Chlöé
2015-01-01
Being held responsible for our actions strongly determines our moral judgements and decisions. This study examined whether responsibility also influences our affective reaction to others' emotions. We conducted two experiments in order to assess the effect of responsibility and of a sense of agency (the conscious feeling of controlling an action) on the empathic response to pain. In both experiments, participants were presented with video clips showing an actor's facial expression of pain of varying intensity. The empathic response was assessed with behavioural (pain intensity estimation from facial expressions and unpleasantness for the observer ratings) and electrophysiological measures (facial electromyography). Experiment 1 showed enhanced empathic response (increased unpleasantness for the observer and facial electromyography responses) as participants' degree of responsibility for the actor's pain increased. This effect was mainly accounted for by the decisional component of responsibility (compared with the execution component). In addition, experiment 2 found that participants' unpleasantness rating also increased when they had a sense of agency over the pain, while controlling for decision and execution processes. The findings suggest that increased empathy induced by responsibility and a sense of agency may play a role in regulating our moral conduct. PMID:25473014
Behavioral and neural representation of emotional facial expressions across the lifespan
Somerville, Leah H.; Fani, Negar; McClure-Tone, Erin B.
2011-01-01
Humans’ experience of emotion and comprehension of affective cues varies substantially across the lifespan. Work in cognitive and affective neuroscience has begun to characterize behavioral and neural responses to emotional cues that systematically change with age. This review examines work to date characterizing the maturation of facial expression comprehension, and dynamic changes in amygdala recruitment from early childhood through late adulthood while viewing facial expressions of emotion. Recent neuroimaging work has tested amygdala and prefrontal engagement in experimental paradigms mimicking real aspects of social interactions, which we highlight briefly, along with considerations for future research. PMID:21516541
Jiang, Yi; Shannon, Robert W; Vizueta, Nathalie; Bernat, Edward M; Patrick, Christopher J; He, Sheng
2009-02-01
The fusiform face area (FFA) and the superior temporal sulcus (STS) are suggested to process facial identity and facial expression information respectively. We recently demonstrated a functional dissociation between the FFA and the STS as well as correlated sensitivity of the STS and the amygdala to facial expressions using an interocular suppression paradigm [Jiang, Y., He, S., 2006. Cortical responses to invisible faces: dissociating subsystems for facial-information processing. Curr. Biol. 16, 2023-2029.]. In the current event-related brain potential (ERP) study, we investigated the temporal dynamics of facial information processing. Observers viewed neutral, fearful, and scrambled face stimuli, either visibly or rendered invisible through interocular suppression. Relative to scrambled face stimuli, intact visible faces elicited larger positive P1 (110-130 ms) and larger negative N1 or N170 (160-180 ms) potentials at posterior occipital and bilateral occipito-temporal regions respectively, with the N170 amplitude significantly greater for fearful than neutral faces. Invisible intact faces generated a stronger signal than scrambled faces at 140-200 ms over posterior occipital areas whereas invisible fearful faces (compared to neutral and scrambled faces) elicited a significantly larger negative deflection starting at 220 ms along the STS. These results provide further evidence for cortical processing of facial information without awareness and elucidate the temporal sequence of automatic facial expression information extraction.
Maguinness, Corrina; Newell, Fiona N
2015-04-01
There is growing evidence to suggest that facial motion is an important cue for face recognition. However, it is poorly understood whether motion is integrated with facial form information or whether it provides an independent cue to identity. To provide further insight into this issue, we compared the effect of motion on face perception in two developmental prosopagnosics and age-matched controls. Participants first learned faces presented dynamically (video), or in a sequence of static images, in which rigid (viewpoint) or non-rigid (expression) changes occurred. Immediately following learning, participants were required to match a static face image to the learned face. Test face images varied by viewpoint (Experiment 1) or expression (Experiment 2) and were learned or novel face images. We found similar performance across prosopagnosics and controls in matching facial identity across changes in viewpoint when the learned face was shown moving in a rigid manner. However, non-rigid motion interfered with face matching across changes in expression in both individuals with prosopagnosia relative to the performance of control participants. In contrast, non-rigid motion did not differentially affect the matching of facial expressions across changes in identity for either prosopagnosic (Experiment 3). Our results suggest that whilst the processing of rigid motion information of a face may be preserved in developmental prosopagnosia, non-rigid motion can specifically interfere with the representation of structural face information. Taken together, these results suggest that both form and motion cues are important in face perception and that these cues are likely integrated in the representation of facial identity. Copyright © 2015 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Montirosso, Rosario; Peverelli, Milena; Frigerio, Elisa; Crespi, Monica; Borgatti, Renato
2010-01-01
The primary purpose of this study was to examine the effect of the intensity of emotion expression on children's developing ability to label emotion during a dynamic presentation of five facial expressions (anger, disgust, fear, happiness, and sadness). A computerized task (AFFECT--animated full facial expression comprehension test) was used to…
Bell's Palsy in Children: Role of the School Nurse in Early Recognition and Referral
ERIC Educational Resources Information Center
Gordon, Shirley C.
2008-01-01
Bell's palsy is the most common condition affecting facial nerves. It is an acute, rapidly progressing, idiopathic, unilateral facial paralysis that is generally self-limiting and non-life-threatening and occurs in all age groups (Okuwobi, Omole, & Griffith, 2003). The school nurse may be the first person to assess facial palsy and muscle…
ERIC Educational Resources Information Center
Yang, Manshu; Chow, Sy-Miin
2010-01-01
Facial electromyography (EMG) is a useful physiological measure for detecting subtle affective changes in real time. A time series of EMG data contains bursts of electrical activity that increase in magnitude when the pertinent facial muscles are activated. Whereas previous methods for detecting EMG activation are often based on deterministic or…
Larsen, Randy J; Kasimatis, Margaret; Frey, Kurt
1992-09-01
We examined the hypothesis that muscle contractions in the face influence subjective emotional experience. Previously, researchers have been critical of experiments designed to test this facial feedback hypothesis, particularly in terms of methodological problems that may lead to demand characteristics. In an effort to surmount these methodological problems Strack, Martin, and Stepper (1988) developed an experimental procedure whereby subjects were induced to contract facial muscles involved in the production of an emotional pattern, without being asked to actually simulate an emotion. Specifically, subjects were required to hold a pen in their teeth, which unobtrusively creates a contraction of the zygomaticus major muscles, the muscles involved in the production of a human smile. This manipulation minimises the likelihood that subjects are able to interpret their zygomaticus contractions as representing a particular emotion, thereby preventing subjects from determining the purpose of the experiment. Strack et al. (1988) found support for the facial feedback hypothesis applied to pleasant affect, in that subjects in the pen-in-teeth condition rated humorous cartoons as being funnier than subjects in the control condition (in which zygomaticus contractions were inhibited). The present study represents an extension of this nonobtrusive methodology to an investigation of the facial feedback of unpleasant affect. Consistent with the Strack et al. procedure, we wanted to have subjects furrow their brow without actually instructing them to do so and without asking them to produce any emotional facial pattern at all. This was achieved by attaching two golf tees to the subject's brow region (just above the inside corner of each eye) and then instructing them to touch the tips of the golf tees together as part of a "divided-attention" experiment.
Touching the tips of the golf tees together could only be achieved by a contraction of the corrugator supercilii muscles, the muscles involved in the production of a sad emotional facial pattern. Subjects reported significantly more sadness in response to aversive photographs while touching the tips of the golf tees together than under conditions which inhibited corrugator contractions. These results provide evidence, using a new and unobtrusive manipulation, that facial feedback operates for unpleasant affect to a degree similar to that previously found for pleasant affect.
Haunted by a doppelgänger: irrelevant facial similarity affects rule-based judgments.
von Helversen, Bettina; Herzog, Stefan M; Rieskamp, Jörg
2014-01-01
Judging other people is a common and important task. Every day professionals make decisions that affect the lives of other people when they diagnose medical conditions, grant parole, or hire new employees. To prevent discrimination, professional standards require that decision makers render accurate and unbiased judgments solely based on relevant information. Facial similarity to previously encountered persons can be a potential source of bias. Psychological research suggests that people only rely on similarity-based judgment strategies if the provided information does not allow them to make accurate rule-based judgments. Our study shows, however, that facial similarity to previously encountered persons influences judgment even in situations in which relevant information is available for making accurate rule-based judgments and where similarity is irrelevant for the task and relying on similarity is detrimental. In two experiments in an employment context we show that applicants who looked similar to high-performing former employees were judged as more suitable than applicants who looked similar to low-performing former employees. This similarity effect was found despite the fact that the participants used the relevant résumé information about the applicants by following a rule-based judgment strategy. These findings suggest that similarity-based and rule-based processes simultaneously underlie human judgment.
De Winter, François-Laurent; Van den Stock, Jan; de Gelder, Beatrice; Peeters, Ronald; Jastorff, Jan; Sunaert, Stefan; Vanduffel, Wim; Vandenberghe, Rik; Vandenbulcke, Mathieu
2016-09-01
In the healthy brain, modulatory influences from the amygdala commonly explain enhanced activation in face-responsive areas by emotional facial expressions relative to neutral expressions. In the behavioral variant of frontotemporal dementia (bvFTD), facial emotion recognition is impaired and has been associated with atrophy of the amygdala. By combining structural and functional MRI in 19 patients with bvFTD and 20 controls we investigated the neural effects of emotion in face-responsive cortex and its relationship with amygdalar gray matter (GM) volume in neurodegeneration. Voxel-based morphometry revealed decreased GM volume in anterior medio-temporal regions including amygdala in patients compared to controls. During fMRI, we presented dynamic facial expressions (fear and chewing) and their spatiotemporally scrambled versions. We found enhanced activation for fearful compared to neutral faces in ventral temporal cortex and superior temporal sulcus in controls, but not in patients. In the bvFTD group left amygdalar GM volume correlated positively with emotion-related activity in left fusiform face area (FFA). This correlation was amygdala-specific and driven by GM in superficial and basolateral (BLA) subnuclei, consistent with reported amygdalar-cortical networks. The data suggest that anterior medio-temporal atrophy in bvFTD affects emotion processing in distant posterior areas. Copyright © 2016 Elsevier Ltd. All rights reserved.
How face blurring affects body language processing of static gestures in women and men.
Proverbio, A M; Ornaghi, L; Gabaro, V
2018-05-14
The role of facial coding in body language comprehension was investigated by ERP recordings in 31 participants viewing 800 photographs of gestures (iconic, deictic and emblematic), which could be congruent or incongruent with their caption. Facial information was obscured by blurring in half of the stimuli. The task consisted of evaluating picture/caption congruence. Quicker response times were observed in women than in men to congruent stimuli, and a cost for incongruent vs. congruent stimuli was found only in men. Face obscuration neither affected accuracy in women, as reflected by omission percentages, nor reduced their cognitive potentials, suggesting better comprehension of face-deprived pantomimes. The N170 response (modulated by congruity and face presence) peaked later in men than in women. The Late Positivity was much larger for congruent stimuli in the female brain, regardless of face blurring. According to source reconstruction, face presence specifically activated the right superior temporal and fusiform gyri, cingulate cortex and insula; these regions have been reported to be insufficiently activated in face-avoiding individuals with social deficits. Overall, the results corroborate the hypothesis that females might be more resistant to the lack of facial information, or better at understanding body language when facial cues are unavailable.
Relationships between alexithymia, affect recognition, and empathy after traumatic brain injury.
Neumann, Dawn; Zupan, Barbra; Malec, James F; Hammond, Flora
2014-01-01
To determine (1) alexithymia, affect recognition, and empathy differences in participants with and without traumatic brain injury (TBI); (2) the amount of affect recognition variance explained by alexithymia; and (3) the amount of empathy variance explained by alexithymia and affect recognition. Sixty adults with moderate-to-severe TBI; 60 age- and gender-matched controls. Participants were evaluated for alexithymia (difficulty identifying feelings, difficulty describing feelings, and externally-oriented thinking); facial and vocal affect recognition; and affective and cognitive empathy (empathic concern and perspective-taking, respectively). Participants with TBI had significantly higher alexithymia; poorer facial and vocal affect recognition; and lower empathy scores. For TBI participants, facial and vocal affect recognition variances were significantly explained by alexithymia (12% and 8%, respectively); however, the majority of the variance was accounted for by externally-oriented thinking alone. Affect recognition and alexithymia significantly accounted for 16.5% of cognitive empathy. Again, the majority of the variance was primarily explained by externally-oriented thinking. Affect recognition and alexithymia did not explain affective empathy. Results suggest that people who have a tendency to avoid thinking about emotions (externally-oriented thinking) are more likely to have problems recognizing others' emotions and assuming others' points of view. Clinical implications are discussed.
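The "variance explained" figures reported in abstracts like the one above (e.g. 12% of affect recognition explained by alexithymia) are R² values from regression. As a minimal illustrative sketch — using synthetic data and hypothetical variable names, not the study's actual data or pipeline — R² can be computed by ordinary least squares:

```python
# Illustrative sketch (synthetic data, NOT the study's data): computing the
# proportion of outcome variance explained (R^2) by a set of predictors.
import numpy as np

rng = np.random.default_rng(0)
n = 60  # same sample size as the TBI group, for flavor

# Hypothetical predictors: three alexithymia subscale scores.
X = rng.normal(size=(n, 3))
# Hypothetical outcome: affect recognition, driven mostly by the third
# subscale (analogous to externally-oriented thinking) plus noise.
y = 0.1 * X[:, 0] + 0.1 * X[:, 1] - 0.6 * X[:, 2] + rng.normal(size=n)

# OLS fit with an intercept column; R^2 = 1 - SS_res / SS_tot.
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ beta
r_squared = 1 - resid.var() / y.var()
print(f"R^2 = {r_squared:.3f}")  # fraction of outcome variance explained
```

Because the residuals of an intercept-including OLS fit have zero mean, `resid.var() / y.var()` equals SS_res/SS_tot, so the expression matches the textbook definition of R².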
Etchepare, Aurore; Prouteau, Antoinette
2018-04-01
Social cognition has received growing interest in many conditions in recent years. However, this construct still suffers from a considerable lack of consensus, especially regarding the dimensions to be studied and the resulting methodology of clinical assessment. Our review aims to clarify the distinctiveness of the dimensions of social cognition. Based on Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statements, a systematic review was conducted to explore the factor structure of social cognition in the adult general and clinical populations. The initial search provided 441 articles published between January 1982 and March 2017. Eleven studies were included, all conducted in psychiatric populations and/or healthy participants. Most studies were in favor of a two-factor solution. Four studies drew a distinction between low-level (e.g., facial emotion/prosody recognition) and high-level (e.g., theory of mind) information processing. Four others reported a distinction between affective (e.g., facial emotion/prosody recognition) and cognitive (e.g., false beliefs) information processing. Interestingly, attributional style was frequently reported as an additional separate factor of social cognition. Results of factor analyses add further support for the relevance of models differentiating level of information processing (low- vs. high-level) from nature of processed information (affective vs. cognitive). These results add to a significant body of empirical evidence from developmental, clinical research and neuroimaging studies. We argue the relevance of integrating low- versus high-level processing with affective and cognitive processing in a two-dimensional model of social cognition that would be useful for future research and clinical practice. (JINS, 2018, 24, 391-404).
Flom, Ross; Gartman, Peggy
2016-03-01
Several studies have examined dogs' (Canis lupus familiaris) comprehension and use of human communicative cues. Relatively few studies have, however, examined the effects of human affective behavior (i.e., facial and vocal expressions) on dogs' exploratory and point-following behavior. In two experiments, we examined dogs' frequency of following an adult's pointing gesture in locating a hidden reward or treat when it occurred silently, or when it was paired with a positive or negative facial and vocal affective expression. Like prior studies, the current results demonstrate that dogs reliably follow human pointing cues. Unlike prior studies, the current results also demonstrate that the addition of a positive affective facial and vocal expression, when paired with a pointing gesture, did not reliably increase dogs' frequency of locating a hidden piece of food compared to pointing alone. In addition, and within the negative facial and vocal affect conditions of Experiments 1 and 2, dogs were delayed in their exploration of, or approach toward, a baited or sham-baited bowl. However, in Experiment 2, dogs continued to follow an adult's pointing gesture, even when paired with a negative expression, as long as the attention-directing gesture referenced a baited bowl. Together these results suggest that the addition of affective information does not significantly increase or decrease dogs' point-following behavior. Rather, these results demonstrate that the presence or absence of affective expressions influences dogs' exploratory behavior, and the presence or absence of reward affects whether they will follow an unfamiliar adult's attention-directing gesture.
Reconstruction of facial nerve injuries in children.
Fattah, Adel; Borschel, Gregory H; Zuker, Ron M
2011-05-01
Facial nerve trauma is uncommon in children, and many spontaneously recover some function; nonetheless, loss of facial nerve activity leads to functional impairment of ocular and oral sphincters and nasal orifice. In many cases, the impediment posed by facial asymmetry and reduced mimetic function more significantly affects the child's psychosocial interactions. As such, reconstruction of the facial nerve affords great benefits in quality of life. The therapeutic strategy is dependent on numerous factors, including the cause of facial nerve injury, the deficit, the prognosis for recovery, and the time elapsed since the injury. The options for treatment include a diverse range of surgical techniques including static lifts and slings, nerve repairs, nerve grafts and nerve transfers, regional, and microvascular free muscle transfer. We review our strategies for addressing facial nerve injuries in children.
NASA Astrophysics Data System (ADS)
Khan, Masood Mehmood; Ward, Robert D.; Ingleby, Michael
The ability to distinguish feigned from involuntary expressions of emotions could help in the investigation and treatment of neuropsychiatric and affective disorders and in the detection of malingering. This work investigates differences in emotion-specific patterns of thermal variations along the major facial muscles. Using experimental data extracted from 156 images, we attempted to classify patterns of emotion-specific thermal variations into neutral, and voluntary and involuntary expressions of positive and negative emotive states. Initial results suggest (i) each facial muscle exhibits a unique thermal response to various emotive states; (ii) the pattern of thermal variances along the facial muscles may assist in classifying voluntary and involuntary facial expressions; and (iii) facial skin temperature measurements along the major facial muscles may be used in automated emotion assessment.
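The abstract above describes classifying emotion-specific thermal-variation patterns along facial muscles. Purely as a hedged sketch — with synthetic features, hypothetical class labels, and a simple nearest-centroid rule rather than the authors' actual classifier — such a pattern classification step might look like:

```python
# Illustrative sketch (NOT the authors' pipeline): nearest-centroid
# classification of synthetic "thermal variation" feature vectors.
import numpy as np

rng = np.random.default_rng(1)
classes = ["neutral", "voluntary_positive", "involuntary_negative"]

# Synthetic training data: mean thermal change along 5 facial muscles,
# one tight cluster per hypothetical emotive state.
centers = rng.normal(size=(3, 5))
train = {c: centers[i] + rng.normal(scale=0.1, size=(20, 5))
         for i, c in enumerate(classes)}

# Class centroids estimated from the training samples.
centroids = {c: samples.mean(axis=0) for c, samples in train.items()}

def classify(pattern):
    """Assign a thermal pattern to the class with the nearest centroid."""
    return min(centroids, key=lambda c: np.linalg.norm(pattern - centroids[c]))

# Classify a probe pattern drawn near the first cluster.
probe = centers[0] + rng.normal(scale=0.1, size=5)
print(classify(probe))
```

The design choice here is deliberate simplicity: a nearest-centroid rule only assumes that each emotive state produces a distinct, repeatable pattern of thermal variation, which is exactly the premise stated in the abstract's finding (i).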
Enhanced subliminal emotional responses to dynamic facial expressions.
Sato, Wataru; Kubota, Yasutaka; Toichi, Motomi
2014-01-01
Emotional processing without conscious awareness plays an important role in human social interaction. Several behavioral studies reported that subliminal presentation of photographs of emotional facial expressions induces unconscious emotional processing. However, it was difficult to elicit strong and robust effects using this method. We hypothesized that dynamic presentations of facial expressions would enhance subliminal emotional effects and tested this hypothesis with two experiments. Fearful or happy facial expressions were presented dynamically or statically in either the left or the right visual field for 20 (Experiment 1) and 30 (Experiment 2) ms. Nonsense target ideographs were then presented, and participants reported their preference for them. The results consistently showed that dynamic presentations of emotional facial expressions induced more evident emotional biases toward subsequent targets than did static ones. These results indicate that dynamic presentations of emotional facial expressions induce more evident unconscious emotional processing.
von Piekartz, H; Wallwork, S B; Mohr, G; Butler, D S; Moseley, G L
2015-04-01
Alexithymia, or a lack of emotional awareness, is prevalent in some chronic pain conditions and has been linked to poor recognition of others' emotions. Recognising others' emotions from their facial expression involves both emotional and motor processing, but the possible contribution of motor disruption has not been considered. It is possible that poor performance on emotional recognition tasks could reflect problems with emotional processing, motor processing or both. We hypothesised that people with chronic facial pain would be less accurate in recognising others' emotions from facial expressions, would be less accurate in a motor imagery task involving the face, and that performance on both tasks would be positively related. A convenience sample of 19 people (15 females) with chronic facial pain and 19 gender-matched controls participated. They undertook two tasks; in the first task, they identified the facial emotion presented in a photograph. In the second, they identified whether the person in the image had a facial feature pointed towards their left or right side, a well-recognised paradigm to induce implicit motor imagery. People with chronic facial pain performed worse than controls at both tasks (Facially Expressed Emotion Labelling (FEEL) task P < 0·001; left/right judgment task P < 0·001). Participants who were more accurate at one task were also more accurate at the other, regardless of group (P < 0·001, r² = 0·523). Participants with chronic facial pain were worse than controls at both the FEEL emotion recognition task and the left/right facial expression task and performance covaried within participants. We propose that disrupted motor processing may underpin or at least contribute to the difficulty that facial pain patients have in emotion recognition and that further research that tests this proposal is warranted. © 2014 John Wiley & Sons Ltd.
Haldane, Morgan; Jogia, Jigar; Cobb, Annabel; Kozuch, Eliza; Kumari, Veena; Frangou, Sophia
2008-01-01
Verbal working memory and emotional self-regulation are impaired in Bipolar Disorder (BD). Our aim was to investigate the effect of Lamotrigine (LTG), which is effective in the clinical management of BD, on the neural circuits subserving working memory and emotional processing. Functional Magnetic Resonance Imaging data from 12 stable BD patients was used to detect LTG-induced changes as the differences in brain activity between drug-free and post-LTG monotherapy conditions during a verbal working memory (N-back sequential letter task) and an angry facial affect recognition task. For both tasks, LTG monotherapy compared to baseline was associated with increased activation mostly within the prefrontal cortex and cingulate gyrus, in regions normally engaged in verbal working memory and emotional processing. Therefore, LTG monotherapy in BD patients may enhance cortical function within neural circuits involved in memory and emotional self-regulation.
[Treatment of idiopathic peripheral facial nerve paralysis (Bell's palsy)].
Meyer, Martin Willy; Hahn, Christoffer Holst
2013-01-28
Bell's palsy is defined as an idiopathic peripheral facial nerve paralysis of sudden onset. It affects 11-40 persons per 100,000 per annum. Many patients recover without intervention; however, up to 30% have poor recovery of facial muscle control and experience facial disfigurement. The aim of this study was to provide an overview of the pharmacological treatments that have been used to improve outcomes. The available evidence from randomized controlled trials shows significant benefit from treating Bell's palsy with corticosteroids but shows no benefit from antivirals.
Proverbio, Alice Mado; Lozano Nasi, Valentina; Alessandra Arcari, Laura; De Benedetto, Francesco; Guardamagna, Matteo; Gazzola, Martina; Zani, Alberto
2015-10-15
The aim of this study was to investigate how background auditory processing can affect other perceptual and cognitive processes as a function of stimulus content, style and emotional nature. Previous studies have offered contrasting evidence, and it has been recently shown that listening to music negatively affected concurrent mental processing in the elderly but not in young adults. To further investigate this matter, the effect of listening to music vs. listening to the sound of rain or silence was examined by administering an old/new face memory task (involving 448 unknown faces) to a group of 54 non-musician university students. Heart rate and diastolic and systolic blood pressure were measured during an explicit face study session that was followed by a memory test. The results indicated that more efficient and faster recall of faces occurred under conditions of silence or when participants were listening to emotionally touching music. Whereas auditory background (e.g., rain or joyful music) interfered with memory encoding, listening to emotionally touching music improved memory and significantly increased heart rate. It is hypothesized that touching music is able to modify the visual perception of faces by binding facial properties with auditory and emotionally charged information (music), which may therefore result in deeper memory encoding.
Mapp, Oni M.; Wanner, Sarah J.; Rohrschneider, Monica R.; Prince, Victoria E.
2011-01-01
The facial branchiomotor neurons undergo a characteristic tangential migration in the vertebrate hindbrain. Several signaling mechanisms have been implicated in this process, including the non-canonical Wnt/planar cell polarity (PCP) pathway. However, the role of this signaling pathway in controlling the dynamics of these neurons is unclear. Here, we describe the cellular dynamics of the facial neurons as they migrate, focusing on the speed and direction of migration, extension of protrusions, cell shape and orientation. Furthermore, we show that the PET/LIM domain protein Prickle1b (Pk1b) is required for several aspects of these migratory behaviors, including cell orientation. However, we find that centrosome localization is not significantly affected by disruption of Pk1b function, suggesting that polarization of the neurons is not completely lost. Together, our data suggest that Pk1b function may be required to integrate the multiple migratory cues received by the neurons into polarization instructions for proper posterior movement. PMID:20503357
Expanded flap to repair facial scar left by radiotherapy of hemangioma.
Zhao, Donghong; Ma, Xinrong; Li, Jiang; Zhang, Lingfeng; Zhu, Baozhen
2014-09-01
This study explored the feasibility and clinical efficacy of an expanded flap to repair facial scarring left by radiotherapy of hemangioma. From March 2000 to April 2011, 13 cases of facial cicatrices left by radiotherapy of hemangioma were treated with implantation of a facial skin dilator under local anesthesia. After water flood expansion for 1-2 months, resection of the facial scar was performed, and the wound was repaired with expansion flap transfer. The 13 patients were followed up from 5 months to 3 years. All patients tolerated flap transfer well; no contracture occurred during the facial expansion flap transfer. The incision scar was not obvious, and its color and texture were identical to surrounding skin. In conclusion, the use of expanded flap transfer to repair facial scarring left by radiotherapy of hemangioma is advantageous due to its simplicity, flexibility, and large area of repair. This method does not affect the subsequent facial appearance.
Fengler, Ineke; Delfau, Pia-Céline; Röder, Brigitte
2018-04-01
It is yet unclear whether congenitally deaf cochlear implant (CD CI) users' visual and multisensory emotion perception is influenced by their history in sign language acquisition. We hypothesized that early-signing CD CI users, relative to late-signing CD CI users and hearing, non-signing controls, show better facial expression recognition and rely more on the facial cues of audio-visual emotional stimuli. Two groups of young adult CD CI users-early signers (ES CI users; n = 11) and late signers (LS CI users; n = 10)-and a group of hearing, non-signing, age-matched controls (n = 12) performed an emotion recognition task with auditory, visual, and cross-modal emotionally congruent and incongruent speech stimuli. On different trials, participants categorized either the facial or the vocal expressions. The ES CI users more accurately recognized affective prosody than the LS CI users in the presence of congruent facial information. Furthermore, the ES CI users, but not the LS CI users, gained more than the controls from congruent visual stimuli when recognizing affective prosody. Both CI groups performed overall worse than the controls in recognizing affective prosody. These results suggest that early sign language experience affects multisensory emotion perception in CD CI users.
Proposal of Self-Learning and Recognition System of Facial Expression
NASA Astrophysics Data System (ADS)
Ogawa, Yukihiro; Kato, Kunihito; Yamamoto, Kazuhiko
We describe the realization of a more complicated function by using information acquired from several simple built-in functions. We propose a self-learning and recognition system for human facial expressions that operates within a natural relationship between human and robot. A robot equipped with this system can understand human facial expressions and, after the learning process is complete, behave according to those expressions. The system is modelled after the process by which a baby learns its parents' facial expressions. Equipped with a camera, the system can acquire face images; equipped with CdS sensors on the robot's head, it can acquire information about human actions. Using the information from these sensors, the robot can extract the features of each facial expression. After self-learning is completed, when a person changes his or her facial expression in front of the robot, the robot performs the actions associated with that facial expression.
Surguladze, Simon A; Chkonia, Eka D; Kezeli, Archil R; Roinishvili, Maya O; Stahl, Daniel; David, Anthony S
2012-05-01
Abnormalities in visual processing have been found consistently in schizophrenia patients, including deficits in early visual processing, perceptual organization, and facial emotion recognition. There is however no consensus as to whether these abnormalities represent heritable illness traits and what their contribution is to psychopathology. Fifty patients with schizophrenia, 61 of their first-degree healthy relatives, and 50 psychiatrically healthy volunteers were tested with regard to facial affect (FA) discrimination and susceptibility to develop the color-contingent illusion [the McCollough Effect (ME)]. Both patients and relatives demonstrated significantly lower accuracy in FA discrimination compared with controls. There was also a significant effect of familiality: Participants from the same families had more similar accuracy scores than those who belonged to different families. Experiments with the ME showed that schizophrenia patients required longer time to develop the illusion than relatives and controls, which indicated poor visual adaptation in schizophrenia. Relatives were marginally slower than controls. There was no significant association between the measures of FA discrimination accuracy and ME in any of the participant groups. Facial emotion discrimination was associated with the degree of interpersonal problems, as measured by the Schizotypal Personality Questionnaire in relatives and healthy volunteers, whereas the ME was associated with the perceptual-cognitive symptoms of schizotypy and positive symptoms of schizophrenia. Our results support the heritability of FA discrimination deficits as a trait and indicate visual adaptation abnormalities in schizophrenia, which are symptom related.
Khan, Arif O; Aldahmesh, Mohammed A; Mohamed, Jawahir Y; Alkuraya, Fowzan S
2012-06-01
To correlate clinical examination with underlying genotype in asymptomatic females who are potential carriers of X-linked developmental cataract (Nance-Horan syndrome). An ophthalmologist blind to the pedigree performed comprehensive ophthalmic examination for 16 available family members (two affected and six asymptomatic females, five affected and three asymptomatic males). Facial features were also noted. Venous blood was collected for sequencing of the gene NHS. All seven affected family members had congenital or infantile cataract and facial dysmorphism (long face, bulbous nose, abnormal dentition). The six asymptomatic females ranged in age from 4-35 years old. Four had posterior Y-suture centered lens opacities; these four also exhibited the facial dysmorphism of the seven affected family members. The fifth asymptomatic girl had scattered fine punctate lens opacities (not centered on the Y-suture) while the sixth had clear lenses, and neither exhibited the facial dysmorphism. A novel NHS mutation (p.Lys744AsnfsX15 [c.2232delG]) was found in the seven patients with congenital or infantile cataract. This mutation was also present in the four asymptomatic girls with Y-centered lens opacities but not in the other two asymptomatic girls or in the three asymptomatic males (who had clear lenses). Lens opacities centered around the posterior Y-suture in the context of certain facial features were sensitive and specific clinical signs of carrier status for NHS mutation in asymptomatic females. Lens opacities that did not have this characteristic morphology in a suspected female carrier were not a carrier sign, even in the context of her affected family members.
Vlamings, Petra Hendrika Johanna Maria; Jonkman, Lisa Marthe; van Daalen, Emma; van der Gaag, Rutger Jan; Kemner, Chantal
2010-12-15
A detailed visual processing style has been noted in autism spectrum disorder (ASD); this contributes to problems in face processing and has been directly related to abnormal processing of spatial frequencies (SFs). Little is known about the early development of face processing in ASD and the relation with abnormal SF processing. We investigated whether young ASD children show abnormalities in low spatial frequency (LSF, global) and high spatial frequency (HSF, detailed) processing and explored whether these are crucially involved in the early development of face processing. Three- to 4-year-old children with ASD (n = 22) were compared with developmentally delayed children without ASD (n = 17). Spatial frequency processing was studied by recording visual evoked potentials from visual brain areas while children passively viewed gratings (HSF/LSF). In addition, children watched face stimuli with different expressions, filtered to include only HSF or LSF. Enhanced activity in visual brain areas was found in response to HSF versus LSF information in children with ASD, in contrast to control subjects. Furthermore, facial-expression processing was also primarily driven by detail in ASD. Enhanced visual processing of detailed (HSF) information is present early in ASD and occurs for neutral (gratings), as well as for socially relevant stimuli (facial expressions). These data indicate that there is a general abnormality in visual SF processing in early ASD and are in agreement with suggestions that a fast LSF subcortical face processing route might be affected in ASD. This could suggest that abnormal visual processing is causative in the development of social problems in ASD. Copyright © 2010 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
Crouzon syndrome: a social stigma.
Pandey, Neelisha; Pandey, Ramesh Kumar; Singh, Rajeev Kumar; Shah, Naveen Kumar
2012-10-10
Crouzon syndrome is a rare disorder caused by genetic mutations. It is characterised by partial hearing loss, dry eyes, strabismus and underdevelopment of the upper jaw, with facial deformities and malocclusion. These facial deformities greatly affect the social and emotional development of the affected child. The present case report highlights the social problems faced by a child suffering from Crouzon syndrome.
Beauty and the beast: Psychobiologic and evolutionary perspectives on body dysmorphic disorder.
Stein, Dan J; Carey, Paul D; Warwick, James
2006-06-01
Body dysmorphic disorder (BDD) is characterized by preoccupation with a defect in appearance. Concepts of beauty play a particularly crucial role in humans' mental and social life, and may have specific psychobiologic and evolutionary underpinnings. In particular, there is a growing literature on the neurocircuitry underpinning the body schema, body image and facial expression processing, and aesthetic and symmetry judgments. Speculatively, disruptions in cognitive-affective processes relevant to judgments about physical beauty lead to BDD.
Anaplastology in times of facial transplantation: Still a reasonable treatment option?
Toso, Sabine Maria; Menzel, Kerstin; Motzkus, Yvonne; Klein, Martin; Menneking, Horst; Raguse, Jan-Dirk; Nahles, Susanne; Hoffmeister, Bodo; Adolphs, Nicolai
2015-09-01
Optimum functional and aesthetic facial reconstruction is still a challenge in patients who suffer from inborn or acquired facial deformity. It is known that functional and aesthetic impairment can result in significant psychosocial strain, leading to the social isolation of patients who are affected by major facial deformities. Microvascular techniques and increasing experience in facial transplantation certainly contribute to better restorative outcomes. However, these technologies also have some drawbacks, limitations and unsolved problems. Extensive facial defects which include several aesthetic units and dentition can be restored by combining dental prostheses and anaplastology, thus providing an adequate functional and aesthetic outcome in selected patients without the drawbacks of major surgical procedures. Referring to some representative patient cases, it is shown how extreme facial disfigurement after oncological surgery can be palliated by combining intraoral dentures with extraoral facial prostheses using individualized treatment and without the need for major reconstructive surgery. Copyright © 2015 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Johnson, Kerri L.; McKay, Lawrie S.; Pollick, Frank E.
2011-01-01
Gender stereotypes have been implicated in sex-typed perceptions of facial emotion. Such interpretations were recently called into question because facial cues of emotion are confounded with sexually dimorphic facial cues. Here we examine the role of visual cues and gender stereotypes in perceptions of biological motion displays, thus overcoming…
Evaluating Posed and Evoked Facial Expressions of Emotion from Adults with Autism Spectrum Disorder
ERIC Educational Resources Information Center
Faso, Daniel J.; Sasson, Noah J.; Pinkham, Amy E.
2015-01-01
Though many studies have examined facial affect perception by individuals with autism spectrum disorder (ASD), little research has investigated how facial expressivity in ASD is perceived by others. Here, naïve female observers (n = 38) judged the intensity, naturalness and emotional category of expressions produced by adults with ASD (n = 6) and…
Liu, Yanling; Lan, Haiying; Teng, Zhaojun; Guo, Cheng; Yao, Dezhong
2017-01-01
Previous research has been inconsistent on whether violent video games exert positive and/or negative effects on cognition. In particular, attentional bias in facial affect processing after violent video game exposure remains controversial. The aim of the present study was to investigate attentional bias in facial recognition after short-term exposure to violent video games and to characterize the neural correlates of this effect. To accomplish this, participants were exposed to either neutral or violent video games for 25 min, and event-related potentials (ERPs) were then recorded during two emotional search tasks. The first search task assessed attentional facilitation, in which participants were required to identify an emotional face from a crowd of neutral faces. In contrast, the second task measured disengagement, in which participants were required to identify a neutral face from a crowd of emotional faces. We found a significant N2pc ERP component during the facilitation task; however, no differences were observed between the two video game groups. This finding does not support a link between attentional facilitation and violent video game exposure. Comparatively, during the disengagement task, N2pc responses were not observed when participants viewed happy faces following violent video game exposure, whereas a weak N2pc response was observed after neutral video game exposure. These results provide only inconsistent support for the disengagement hypothesis, suggesting that participants found it difficult to separate a neutral face from a crowd of emotional faces. PMID:28249033
El Haj, Mohamad; Daoudi, Mohamed; Gallouj, Karim; Moustafa, Ahmed A; Nandrino, Jean-Louis
2018-05-11
Thanks to current advances in the software analysis of facial expressions, there is burgeoning interest in understanding the emotional facial expressions observed during the retrieval of autobiographical memories. This review describes the research on facial expressions during autobiographical retrieval, showing distinct emotional facial expressions according to the characteristics of retrieved memories. More specifically, this research demonstrates that the retrieval of emotional memories can trigger corresponding emotional facial expressions (e.g. positive memories may trigger positive facial expressions). This research also demonstrates variations in facial expressions according to the specificity, self-relevance, or past versus future direction of memory construction. Besides linking research on facial expressions during autobiographical retrieval to the cognitive and affective characteristics of autobiographical memory in general, this review positions this research within the broader context of research on the physiologic characteristics of autobiographical retrieval. We also provide several perspectives for clinical studies to investigate facial expressions in populations with deficits in autobiographical memory (e.g. whether autobiographical overgenerality in neurologic and psychiatric populations may trigger fewer emotional facial expressions). In sum, this review demonstrates how the evaluation of facial expressions during autobiographical retrieval may help us understand the functioning and dysfunctioning of autobiographical memory.
Young, Garry
2009-09-01
Explanations of Capgras delusion and prosopagnosia typically incorporate a dual-route approach to facial recognition in which a deficit in overt or covert processing in one condition is mirror-reversed in the other. Despite this double dissociation, experiences of either patient-group are often reported in the same way--as lacking a sense of familiarity toward familiar faces. In this paper, deficits in the facial processing of these patients are compared to other facial recognition pathologies, and their experiential characteristics mapped onto the dual-route model in order to provide a less ambiguous link between facial processing and experiential content. The paper concludes that the experiential states of Capgras delusion, prosopagnosia, and related facial pathologies are quite distinct, and that this descriptive distinctiveness finds explanatory equivalence at the level of anatomical and functional disruption within the face recognition system. The role of skin conductance response (SCR) as a measure of 'familiarity' is also clarified.
Attention to emotion and non-Western faces: revisiting the facial feedback hypothesis.
Dzokoto, Vivian; Wallace, David S; Peters, Laura; Bentsi-Enchill, Esi
2014-01-01
In a modified replication of Strack, Martin, and Stepper's demonstration of the Facial Feedback Hypothesis (1988), we investigated the effect of attention to emotion on the facial feedback process in a non-western cultural setting. Participants, recruited from two universities in Ghana, West Africa, gave self-reports of their perceived levels of attention to emotion, and then completed cartoon-rating tasks while randomly assigned to smiling, frowning, or neutral conditions. While participants with low Attention to Emotion scores displayed the usual facial feedback effect (rating cartoons as funnier when in the smiling compared to the frowning condition), the effect was not present in individuals with high Attention to Emotion. The findings indicate that (1) the facial feedback process can occur in contexts beyond those in which the phenomenon has previously been studied, and (2) aspects of emotion regulation, such as Attention to Emotion can interfere with the facial feedback process.
2014-01-01
Background Alexithymia is a personality trait that is characterized by difficulties in identifying and describing feelings. Previous studies have shown that alexithymia is related to problems in recognizing others’ emotional facial expressions when these are presented with temporal constraints. These problems can be less severe when the expressions are visible for a relatively long time. Because the neural correlates of these recognition deficits are still relatively unexplored, we investigated the labeling of facial emotions and brain responses to facial emotions as a function of alexithymia. Results Forty-eight healthy participants had to label the emotional expression (angry, fearful, happy, or neutral) of faces presented for 1 or 3 seconds in a forced-choice format while undergoing functional magnetic resonance imaging. The participants’ level of alexithymia was assessed using self-report and interview. In light of the previous findings, we focused our analysis on the alexithymia component of difficulties in describing feelings. Difficulties describing feelings, as assessed by the interview, were associated with increased reaction times for negative (i.e., angry and fearful) faces, but not with labeling accuracy. Moreover, individuals with higher alexithymia showed increased brain activation in the somatosensory cortex and supplementary motor area (SMA) in response to angry and fearful faces. These cortical areas are known to be involved in the simulation of the bodily (motor and somatosensory) components of facial emotions. Conclusion The present data indicate that alexithymic individuals may use information related to bodily actions rather than affective states to understand the facial expressions of other persons. PMID:24629094
ERIC Educational Resources Information Center
Poljac, Ervin; Poljac, Edita; Wagemans, Johan
2013-01-01
Autism spectrum disorder (ASD) is among other things characterized by specific impairments in emotion processing. It is not clear, however, to what extent the typical decline in affective functioning is related to the specific autistic traits. We employed "The Autism Spectrum-Quotient" (AQ) to quantify autistic traits in a group of 500…
Perceptual, Categorical, and Affective Processing of Ambiguous Smiling Facial Expressions
ERIC Educational Resources Information Center
Calvo, Manuel G.; Fernandez-Martin, Andres; Nummenmaa, Lauri
2012-01-01
Why is a face with a smile but non-happy eyes likely to be interpreted as happy? We used blended expressions in which a smiling mouth was incongruent with the eyes (e.g., angry eyes), as well as genuine expressions with congruent eyes and mouth (e.g., both happy or angry). Tasks involved detection of a smiling mouth (perceptual), categorization of…
Recio, Guillermo; Wilhelm, Oliver; Sommer, Werner; Hildebrandt, Andrea
2017-04-01
Despite a wealth of knowledge about the neural mechanisms behind emotional facial expression processing, little is known about how they relate to individual differences in social cognition abilities. We studied individual differences in the event-related potentials (ERPs) elicited by dynamic facial expressions. First, we assessed the latent structure of the ERPs, reflecting structural face processing in the N170, and the allocation of processing resources and reflexive attention to emotionally salient stimuli, in the early posterior negativity (EPN) and the late positive complex (LPC). Then we estimated brain-behavior relationships between the ERP factors and behavioral indicators of facial identity and emotion-processing abilities. Structural models revealed that the participants who formed faster structural representations of neutral faces (i.e., shorter N170 latencies) performed better at face perception (r = -.51) and memory (r = -.42). The N170 amplitude was not related to individual differences in face cognition or emotion processing. The latent EPN factor correlated with emotion perception (r = .47) and memory (r = .32), and also with face perception abilities (r = .41). Interestingly, the latent factor representing the difference in EPN amplitudes between the two neutral control conditions (chewing and blinking movements) also correlated with emotion perception (r = .51), highlighting the importance of tracking facial changes in the perception of emotional facial expressions. The LPC factor for negative expressions correlated with the memory for emotional facial expressions. The links revealed between the latency and strength of activations of brain systems and individual differences in processing socio-emotional information provide new insights into the brain mechanisms involved in social communication.
Duval, Elizabeth R; Lovelace, Christopher T; Aarant, Justin; Filion, Diane L
2013-12-01
The purpose of this study was to investigate the effects of both facial expression and face gender on startle eyeblink response patterns at varying lead intervals (300, 800, and 3500 ms) indicative of attentional and emotional processes. We aimed to determine whether responses to affective faces map onto the Defense Cascade Model (Lang et al., 1997) to better understand the stages of processing during affective face viewing. At 300 ms, there was an interaction between face expression and face gender, with female happy and neutral faces and male angry faces producing inhibited startle. At 3500 ms, there was a trend for facilitated startle during angry compared to neutral faces. These findings suggest that affective expressions are perceived differently in male and female faces, especially at short lead intervals. Future studies investigating face processing should take both face gender and expression into account. © 2013.
García-Rodríguez, Beatriz; Guillén, Carmen Casares; Barba, Rosa Jurado; Rubio Valladolid, Gabriel; Arjona, José Antonio Molina; Ellgring, Heiner
2012-02-15
There is evidence that visuo-spatial capacity can become overloaded when processing a secondary visual task (Dual Task, DT), as occurs in daily life. Hence, we investigated the influence of visuo-spatial interference on the identification of emotional facial expressions (EFEs) in the early stages of Parkinson's disease (PD). We compared the identification of 24 emotional faces illustrating six basic emotions in unmedicated, recently diagnosed PD patients (n = 16) and healthy adults (n = 20) under two different conditions: a) simple EFE identification, and b) identification with a concurrent visuo-spatial task (Corsi Blocks). EFE identification by PD patients was significantly worse than that of healthy adults when combined with another visual stimulus. Published by Elsevier B.V.
Cooper, Nicholas R.; Simpson, Andrew; Till, Amy; Simmons, Kelly; Puzzo, Ignazio
2013-01-01
The human mirror neuron system (hMNS) has been associated with various forms of social cognition and affective processing including vicarious experience. It has also been proposed that a faulty hMNS may underlie some of the deficits seen in the autism spectrum disorders (ASDs). In the present study we set out to investigate whether emotional facial expressions could modulate a putative EEG index of hMNS activation (mu suppression) and if so, would this differ according to the individual level of autistic traits [high versus low Autism Spectrum Quotient (AQ) score]. Participants were presented with 3 s films of actors opening and closing their hands (classic hMNS mu-suppression protocol) while simultaneously wearing happy, angry, or neutral expressions. Mu-suppression was measured in the alpha and low beta bands. The low AQ group displayed greater low beta event-related desynchronization (ERD) to both angry and neutral expressions. The high AQ group displayed greater low beta ERD to angry than to happy expressions. There was also significantly more low beta ERD to happy faces for the low than for the high AQ group. In conclusion, an interesting interaction between AQ group and emotional expression revealed that hMNS activation can be modulated by emotional facial expressions and that this is differentiated according to individual differences in the level of autistic traits. The EEG index of hMNS activation (mu suppression) seems to be a sensitive measure of the variability in facial processing in typically developing individuals with high and low self-reported traits of autism. PMID:23630489
Emotional Processing of Infant Displays in Eating Disorders
Cardi, Valentina; Corfield, Freya; Leppanen, Jenni; Rhind, Charlotte; Deriziotis, Stephanie; Hadjimichalis, Alexandra; Hibbs, Rebecca; Micali, Nadia; Treasure, Janet
2014-01-01
Aim The aim of this study is to examine emotional processing of infant displays in people with Eating Disorders (EDs). Background Social and emotional factors are implicated as causal and maintaining factors in EDs. Difficulties in emotional regulation have been mainly studied in relation to adult interactions, with less interest given to interactions with infants. Method A sample of 138 women were recruited, of whom 49 suffered from Anorexia Nervosa (AN), 16 from Bulimia Nervosa (BN), and 73 were healthy controls (HCs). Attentional responses to happy and sad infant faces were tested with the visual probe detection task. Emotional identification of, and reactivity to, infant displays were measured using self-report measures. Facial expressions to video clips depicting sad, happy and frustrated infants were also recorded. Results No significant differences between groups were observed in the attentional response to infant photographs. However, there was a trend for patients to disengage from happy faces. People with EDs also reported lower positive ratings of happy infant displays and greater subjective negative reactions to sad infants. Finally, patients showed a significantly lower production of facial expressions, especially in response to the happy infant video clip. Insecure attachment was negatively correlated with positive facial expressions displayed in response to the happy infant and positively correlated with the intensity of negative emotions experienced in response to the sad infant video clip. Conclusion People with EDs do not have marked abnormalities in their attentional processing of infant emotional faces. However, they do have a reduction in facial affect, particularly in response to happy infants. They also report greater negative reactions to sadness, and rate positive emotions less intensely than HCs. This pattern of emotional responsivity suggests abnormalities in social reward sensitivity and might indicate new treatment targets.
PMID:25463051
Flack, Tessa R; Andrews, Timothy J; Hymers, Mark; Al-Mosaiwi, Mohammed; Marsden, Samuel P; Strachan, James W A; Trakulpipat, Chayanit; Wang, Liang; Wu, Tian; Young, Andrew W
2015-08-01
The face-selective region of the right posterior superior temporal sulcus (pSTS) plays an important role in analysing facial expressions. However, it is less clear how facial expressions are represented in this region. In this study, we used the face composite effect to explore whether the pSTS contains a holistic or feature-based representation of facial expression. Aligned and misaligned composite images were created from the top and bottom halves of faces posing different expressions. In Experiment 1, participants performed a behavioural matching task in which they judged whether the top half of two images was the same or different. The ability to discriminate the top half of the face was affected by changes in the bottom half of the face when the images were aligned, but not when they were misaligned. This shows a holistic behavioural response to expression. In Experiment 2, we used fMR-adaptation to ask whether the pSTS has a corresponding holistic neural representation of expression. Aligned or misaligned images were presented in blocks that involved repeating the same image or in which the top or bottom half of the images changed. Increased neural responses were found in the right pSTS regardless of whether the change occurred in the top or bottom of the image, showing that changes in expression were detected across all parts of the face. However, in contrast to the behavioural data, the pattern did not differ between aligned and misaligned stimuli. This suggests that the pSTS does not encode facial expressions holistically. In contrast to the pSTS, a holistic pattern of response to facial expression was found in the right inferior frontal gyrus (IFG). Together, these results suggest that pSTS reflects an early stage in the processing of facial expression in which facial features are represented independently. Copyright © 2015 Elsevier Ltd. All rights reserved.
Yankouskaya, Alla; Humphreys, Glyn W.; Rotshtein, Pia
2014-01-01
Facial identity and emotional expression are two important sources of information for daily social interaction. However the link between these two aspects of face processing has been the focus of an unresolved debate for the past three decades. Three views have been advocated: (1) separate and parallel processing of identity and emotional expression signals derived from faces; (2) asymmetric processing with the computation of emotion in faces depending on facial identity coding but not vice versa; and (3) integrated processing of facial identity and emotion. We present studies with healthy participants that primarily apply methods from mathematical psychology, formally testing the relations between the processing of facial identity and emotion. Specifically, we focused on the “Garner” paradigm, the composite face effect and the divided attention tasks. We further ask whether the architecture of face-related processes is fixed or flexible and whether (and how) it can be shaped by experience. We conclude that formal methods of testing the relations between processes show that the processing of facial identity and expressions interact, and hence are not fully independent. We further demonstrate that the architecture of the relations depends on experience; where experience leads to higher degree of inter-dependence in the processing of identity and expressions. We propose that this change occurs as integrative processes are more efficient than parallel. Finally, we argue that the dynamic aspects of face processing need to be incorporated into theories in this field. PMID:25452722
Gray, Nicola S.; Snowden, Robert J.
2017-01-01
Psychopathic individuals show a range of affective processing deficits, typically associated with the interpersonal/affective component of psychopathy. However, previous research has been inconsistent as to whether psychopathy, within both offender and community populations, is associated with deficient autonomic responses to the simple presentation of affective stimuli. Changes in pupil diameter occur in response to emotionally arousing stimuli and can be used as an objective indicator of physiological reactivity to emotion. This study used pupillometry to explore whether psychopathic traits within a community sample were associated with hypo-responsivity to the affective content of stimuli. Pupil activity was recorded for 102 adult (52 female) community participants in response to affective (both negative and positive affect) and affectively neutral stimuli, that included images of scenes, static facial expressions, dynamic facial expressions and sound-clips. Psychopathic traits were measured using the Triarchic Psychopathy Measure. Pupil diameter was larger in response to negative stimuli, but comparable pupil size was demonstrated across pleasant and neutral stimuli. A linear relationship between subjective arousal and pupil diameter was found in response to sound-clips, but was not evident in response to scenes. Contrary to predictions, psychopathy was unrelated to emotional modulation of pupil diameter across all stimuli. The findings were the same when participant gender was considered. This suggests that psychopathy within a community sample is not associated with autonomic hypo-responsivity to affective stimuli, and this effect is discussed in relation to later defensive/appetitive mobilisation deficits. PMID:28118366
Goldstein-Piekarski, Andrea N.; Greer, Stephanie M.; Saletin, Jared M.; Walker, Matthew P.
2015-01-01
Facial expressions represent one of the most salient cues in our environment. They communicate the affective state and intent of an individual and, if interpreted correctly, adaptively influence the behavior of others in return. Processing of such affective stimuli is known to require reciprocal signaling between central viscerosensory brain regions and peripheral-autonomic body systems, culminating in accurate emotion discrimination. Despite emerging links between sleep and affective regulation, the impact of sleep loss on the discrimination of complex social emotions within and between the CNS and PNS remains unknown. Here, we demonstrate in humans that sleep deprivation impairs both viscerosensory brain (anterior insula, anterior cingulate cortex, amygdala) and autonomic-cardiac discrimination of threatening from affiliative facial cues. Moreover, sleep deprivation significantly degrades the normally reciprocal associations between these central and peripheral emotion-signaling systems, most prominent at the level of cardiac-amygdala coupling. In addition, REM sleep physiology across the sleep-rested night significantly predicts the next-day success of emotional discrimination within this viscerosensory network across individuals, suggesting a role for REM sleep in affective brain recalibration. Together, these findings establish that sleep deprivation compromises the faithful signaling of, and the “embodied” reciprocity between, viscerosensory brain and peripheral autonomic body processing of complex social signals. Such impairments hold ecological relevance in professional contexts in which the need for accurate interpretation of social cues is paramount yet insufficient sleep is pervasive. PMID:26180190
Goldstein-Piekarski, Andrea N; Greer, Stephanie M; Saletin, Jared M; Walker, Matthew P
2015-07-15
Facial expressions represent one of the most salient cues in our environment. They communicate the affective state and intent of an individual and, if interpreted correctly, adaptively influence the behavior of others in return. Processing of such affective stimuli is known to require reciprocal signaling between central viscerosensory brain regions and peripheral-autonomic body systems, culminating in accurate emotion discrimination. Despite emerging links between sleep and affective regulation, the impact of sleep loss on the discrimination of complex social emotions within and between the CNS and PNS remains unknown. Here, we demonstrate in humans that sleep deprivation impairs both viscerosensory brain (anterior insula, anterior cingulate cortex, amygdala) and autonomic-cardiac discrimination of threatening from affiliative facial cues. Moreover, sleep deprivation significantly degrades the normally reciprocal associations between these central and peripheral emotion-signaling systems, most prominent at the level of cardiac-amygdala coupling. In addition, REM sleep physiology across the sleep-rested night significantly predicts the next-day success of emotional discrimination within this viscerosensory network across individuals, suggesting a role for REM sleep in affective brain recalibration. Together, these findings establish that sleep deprivation compromises the faithful signaling of, and the "embodied" reciprocity between, viscerosensory brain and peripheral autonomic body processing of complex social signals. Such impairments hold ecological relevance in professional contexts in which the need for accurate interpretation of social cues is paramount yet insufficient sleep is pervasive. Copyright © 2015 the authors 0270-6474/15/3510135-11$15.00/0.
Kang, Guanlan; Zhou, Xiaolin; Wei, Ping
2015-09-01
The present study investigated the effect of reward expectation and spatial orientation on the processing of emotional facial expressions, using a spatial cue-target paradigm. A colored cue was presented at the left or right side of the central fixation point, with its color indicating the monetary reward stakes of a given trial (incentive vs. non-incentive), followed by the presentation of an emotional facial target (angry vs. neutral) at a cued or un-cued location. Participants were asked to discriminate the emotional expression of the target, with the cue-target stimulus onset asynchrony being 200-300 ms in Experiment 1 and 950-1250 ms in Experiment 2a (without a fixation cue) and Experiment 2b (with a fixation cue), producing a spatial facilitation effect and an inhibition of return effect, respectively. The results of all the experiments revealed faster reaction times in the monetary incentive condition than in the non-incentive condition, demonstrating the effect of reward in facilitating task performance. An interaction between reward expectation and the emotion of the target was evident in all three experiments, with larger reward effects for angry faces than for neutral faces. This interaction was not affected by spatial orientation. These findings demonstrate that incentive motivation improves task performance and increases sensitivity to angry faces, irrespective of spatial orienting and reorienting processes.
Ibáñez, Agustín; Riveros, Rodrigo; Hurtado, Esteban; Gleichgerrcht, Ezequiel; Urquina, Hugo; Herrera, Eduar; Amoruso, Lucía; Reyes, Migdyrai Martin; Manes, Facundo
2012-01-30
Previous studies have reported facial emotion recognition impairments in schizophrenic patients, as well as abnormalities in the N170 component of the event-related potential. Current research on schizophrenia highlights the importance of complexly-inherited brain-based deficits. In order to examine the N170 markers of face structural and emotional processing, DSM-IV diagnosed schizophrenia probands (n=13), unaffected first-degree relatives from multiplex families (n=13), and control subjects (n=13) matched by age, gender and educational level, performed a categorization task which involved words and faces with positive and negative valence. The N170 component, while present in relatives and control subjects, was reduced in patients, not only for faces, but also for face-word differences, suggesting a deficit in structural processing of stimuli. Control subjects showed N170 modulation according to the valence of facial stimuli. However, this discrimination effect was found to be reduced both in patients and relatives. This is the first report showing N170 valence deficits in relatives. Our results suggest a generalized deficit affecting the structural encoding of faces in patients, as well as the emotion discrimination both in patients and relatives. Finally, these findings lend support to the notion that cortical markers of facial discrimination can be validly considered as vulnerability markers. © 2011 Elsevier Ireland Ltd. All rights reserved.
Jenkins, L M; Kendall, A D; Kassel, M T; Patrón, V G; Gowins, J R; Dion, C; Shankman, S A; Weisenbach, S L; Maki, P; Langenecker, S A
2018-01-01
Sex differences in emotion processing may play a role in women's increased risk for Major Depressive Disorder (MDD). However, studies of sex differences in brain mechanisms involved in emotion processing in MDD (or interactions of sex and diagnosis) are sparse. We conducted an event-related fMRI study examining the interactive and distinct effects of sex and MDD on neural activity during a facial emotion perception task. To minimize effects of current affective state and cumulative disease burden, we studied participants with remitted MDD (rMDD) who were early in the course of the illness. In total, 88 individuals aged 18-23 participated, including 48 with rMDD (32 female) and 40 healthy controls (HC; 25 female). fMRI revealed an interaction between sex and diagnosis for sad and neutral facial expressions in the superior frontal gyrus and left middle temporal gyrus. Results also revealed an interaction of sex with diagnosis in the amygdala. Data were from two sites, which might increase variability, but this also increases power to examine sex by diagnosis interactions. This study demonstrates the importance of taking sex differences into account when examining potential trait (or scar) mechanisms that could be useful in identifying individuals at risk for MDD as well as for evaluating potential therapeutic innovations. Copyright © 2017 Elsevier B.V. All rights reserved.
van Ommen, M M; van Beilen, M; Cornelissen, F W; Smid, H G O M; Knegtering, H; Aleman, A; van Laar, T
2016-06-01
Little is known about visual hallucinations (VH) in psychosis. We investigated the prevalence and the role of bottom-up and top-down processing in VH. The prevailing view is that VH are probably related to altered top-down processing, rather than to distorted bottom-up processing. Conversely, VH in Parkinson's disease are associated with impaired visual perception and attention, as proposed by the Perception and Attention Deficit (PAD) model. Auditory hallucinations (AH) in psychosis, however, are thought to be related to increased attention. Our retrospective database study included 1119 patients with non-affective psychosis and 586 controls. The Community Assessment of Psychic Experiences established the VH rate. Scores on visual perception tests [Degraded Facial Affect Recognition (DFAR), Benton Facial Recognition Task] and attention tests [Response Set-shifting Task, Continuous Performance Test-HQ (CPT-HQ)] were compared between 75 VH patients, 706 non-VH patients and 485 non-VH controls. The lifetime VH rate was 37%. The patient groups performed similarly on cognitive tasks; both groups showed worse perception (DFAR) than controls. Non-VH patients showed worse attention (CPT-HQ) than controls, whereas VH patients did not perform differently. We did not find significant VH-related impairments in bottom-up processing or direct top-down alterations. However, the results suggest a relatively spared attentional performance in VH patients, whereas face perception and processing speed were equally impaired in both patient groups relative to controls. This would match better with the increased attention hypothesis than with the PAD model. Our finding that VH frequently co-occur with AH may support an increased attention-induced 'hallucination proneness'.
Cues of Fatigue: Effects of Sleep Deprivation on Facial Appearance
Sundelin, Tina; Lekander, Mats; Kecklund, Göran; Van Someren, Eus J. W.; Olsson, Andreas; Axelsson, John
2013-01-01
Study Objective: To investigate the facial cues by which one recognizes that someone is sleep deprived versus not sleep deprived. Design: Experimental laboratory study. Setting: Karolinska Institutet, Stockholm, Sweden. Participants: Forty observers (20 women, mean age 25 ± 5 y) rated 20 facial photographs with respect to fatigue, 10 facial cues, and sadness. The stimulus material consisted of 10 individuals (five women) photographed at 14:30 after normal sleep and after 31 h of sleep deprivation following a night with 5 h of sleep. Measurements: Ratings of fatigue, fatigue-related cues, and sadness in facial photographs. Results: The faces of sleep deprived individuals were perceived as having more hanging eyelids, redder eyes, more swollen eyes, darker circles under the eyes, paler skin, more wrinkles/fine lines, and more droopy corners of the mouth (effects ranging from b = +3 ± 1 to b = +15 ± 1 mm on 100-mm visual analog scales, P < 0.01). The ratings of fatigue were related to glazed eyes and to all the cues affected by sleep deprivation (P < 0.01). Ratings of rash/eczema or tense lips were not significantly affected by sleep deprivation, nor associated with judgements of fatigue. In addition, sleep-deprived individuals looked sadder than after normal sleep, and sadness was related to looking fatigued (P < 0.01). Conclusions: The results show that sleep deprivation affects features relating to the eyes, mouth, and skin, and that these features function as cues of sleep loss to other people. Because these facial regions are important in the communication between humans, facial cues of sleep deprivation and fatigue may carry social consequences for the sleep deprived individual in everyday life. Citation: Sundelin T; Lekander M; Kecklund G; Van Someren EJW; Olsson A; Axelsson J. Cues of fatigue: effects of sleep deprivation on facial appearance. SLEEP 2013;36(9):1355-1360. PMID:23997369
Loss of ADAMTS3 activity causes Hennekam lymphangiectasia-lymphedema syndrome 3.
Brouillard, Pascal; Dupont, Laura; Helaers, Raphael; Coulie, Richard; Tiller, George E; Peeden, Joseph; Colige, Alain; Vikkula, Miikka
2017-11-01
Primary lymphedema is due to developmental and/or functional defects in the lymphatic system. It may affect any part of the body, with predominance for the lower extremities. Twenty-seven genes have already been linked to primary lymphedema, either isolated, or as part of a syndrome. The proteins that they encode are involved in VEGFR3 receptor signaling. They account for about one third of all primary lymphedema cases, underscoring the existence of additional genetic factors. We used whole-exome sequencing to investigate the underlying cause in a non-consanguineous family with two children affected by lymphedema, lymphangiectasia and distinct facial features. We discovered bi-allelic missense mutations in ADAMTS3. Both were predicted to be highly damaging. These amino acid substitutions affect well-conserved residues in the prodomain and in the peptidase domain of ADAMTS3. In vitro, the mutant proteins were abnormally processed and sequestered within cells, which abolished proteolytic activation of pro-VEGFC. VEGFC processing is also affected by CCBE1 mutations that cause the Hennekam lymphangiectasia-lymphedema syndrome type 1. Our data identify ADAMTS3 as a novel gene that can be mutated in individuals affected by the Hennekam syndrome. These patients have distinctive facial features similar to those with mutations in CCBE1. Our results corroborate the recent in vitro and murine data that suggest a close functional interaction between ADAMTS3 and CCBE1 in triggering VEGFR3 signaling, a cornerstone for the differentiation and function of lymphatic endothelial cells. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Hemifacial Microsomia in a Cat.
Song, R B; Kent, M; Glass, E N; Davis, G J; Castro, F A; de Lahunta, A
2017-10-01
A 7-month-old domestic medium hair cat presented with facial asymmetry affecting the bony and soft tissue structures of the right side of the head including the maxilla, nose, eye and pinna of the ear. Additionally, neurological dysfunction of the facial and vestibulocochlear nerves on the affected side was present. A congenital malformation affecting the first and second embryologic pharyngeal arches was suspected. This is the first case of hemifacial microsomia of likely congenital origin reported in a cat. © 2017 Blackwell Verlag GmbH.
Fonzo, Gregory A.; Ramsawh, Holly J.; Flagan, Taru M.; Sullivan, Sarah G.; Letamendi, Andrea; Simmons, Alan N.; Paulus, Martin P.; Stein, Murray B.
2015-01-01
Background Although evidence exists for abnormal brain function across various anxiety disorders, direct comparison of neural function across diagnoses is needed to elicit abnormalities common across disorders and those distinct to a particular diagnosis. Aims To delineate common and distinct abnormalities within generalised anxiety (GAD), panic and social anxiety disorder (SAD) during affective processing. Method Fifty-nine adults (15 with GAD, 15 with panic disorder, 14 with SAD, and 15 healthy controls) underwent functional magnetic resonance imaging while completing a facial emotion matching task with fearful, angry and happy faces. Results Greater differential right amygdala activation to matching fearful v. happy facial expressions related to greater negative affectivity (i.e. trait anxiety) and was heightened across all anxiety disorder groups compared with controls. Collapsing across emotional face types, participants with panic disorder uniquely displayed greater posterior insula activation. Conclusions These preliminary results highlight a common neural basis for clinical anxiety in these diagnoses and also suggest the presence of disorder-specific dysfunction. PMID:25573399
Recognition memory for emotional and neutral faces: an event-related potential study.
Johansson, Mikael; Mecklinger, Axel; Treese, Anne-Cécile
2004-12-01
This study examined emotional influences on the hypothesized event-related potential (ERP) correlates of familiarity and recollection (Experiment 1) and the states of awareness (Experiment 2) accompanying recognition memory for faces differing in facial affect. Participants made gender judgments to positive, negative, and neutral faces at study and were in the test phase instructed to discriminate between studied and nonstudied faces. Whereas old-new discrimination was unaffected by facial expression, negative faces were recollected to a greater extent than both positive and neutral faces as reflected in the parietal ERP old-new effect and in the proportion of remember judgments. Moreover, emotion-specific modulations were observed in frontally recorded ERPs elicited by correctly rejected new faces that concurred with a more liberal response criterion for emotional as compared to neutral faces. Taken together, the results are consistent with the view that processes promoting recollection are facilitated for negative events and that emotion may affect recognition performance by influencing criterion setting mediated by the prefrontal cortex.
Baran Tatar, Zeynep; Yargıç, İlhan; Oflaz, Serap; Büyükgök, Deniz
2015-01-01
Interpersonal relationship disorders in adults with Attention Deficit Hyperactivity Disorder (ADHD) can be associated with the impairment of non-verbal communication. The purpose of our study was to compare the emotion recognition, facial recognition and neuropsychological assessments of adult ADHD patients with those of healthy controls, and to thus determine the effect of neuropsychological data on the recognition of emotional expressions. This study, which was based on a case-control model, was conducted with patients diagnosed with ADHD according to the DSM-IV-TR, being followed and monitored at the adult ADHD clinic of the Psychiatry Department of the Istanbul University Istanbul Medical Faculty Hospital. The study group consisted of 40 adults (27.5% female) between the ages of 20-65 (mean age 25.96 ± 6.07; education level: 15.02 ± 2.34 years) diagnosed with ADHD, and 40 controls who were matched with the study group with respect to age, gender, and education level. In the ADHD group, 14 (35%) of the patients had concomitant diseases. Pictures of Facial Affect, the Benton Face Recognition Test, and the Continuous Performance Test were used to respectively evaluate emotion recognition, facial recognition, and attention deficit and impulsivity of the patients. It was determined that, in comparison to the control group, the ADHD group made more mistakes in recognizing all types of emotional expressions and neutral expressions. The ADHD group also demonstrated more cognitive mistakes. Facial recognition was similar in both groups. It was determined that impulsivity had a significant effect on facial recognition. The social relationship disorders observed in ADHD can be affected by emotion recognition processes. In future studies, it may be possible to investigate the effects that early psychopharmacological and psychotherapeutic interventions administered for the main symptoms of ADHD have on the impairment of emotion recognition.
Waddell, George; Williamon, Aaron
2017-01-01
Judgments of music performance quality are commonly employed in music practice, education, and research. However, previous studies have demonstrated the limited reliability of such judgments, and there is now evidence that extraneous visual, social, and other “non-musical” features can unduly influence them. The present study employed continuous measurement techniques to examine how the process of forming a music quality judgment is affected by the manipulation of temporally specific visual cues. Video footage comprising an appropriate stage entrance and error-free performance served as the standard condition (Video 1). This footage was manipulated to provide four additional conditions, each identical save for a single variation: an inappropriate stage entrance (Video 2); the presence of an aural performance error midway through the piece (Video 3); the same error accompanied by a negative facial reaction by the performer (Video 4); the facial reaction with no corresponding aural error (Video 5). The participants were 53 musicians and 52 non-musicians (N = 105) who individually assessed the performance quality of one of the five randomly assigned videos via a digital continuous measurement interface and headphones. The results showed that participants viewing the “inappropriate” stage entrance made judgments significantly more quickly than those viewing the “appropriate” entrance, and while the poor entrance caused significantly lower initial scores among those with musical training, the effect did not persist long into the performance. The aural error caused an immediate drop in quality judgments that persisted to a lower final score only when accompanied by the frustrated facial expression from the pianist; the performance error alone caused a temporary drop only in the musicians' ratings, and the negative facial reaction alone caused no reaction regardless of participants' musical experience. 
These findings demonstrate the importance of visual information in forming evaluative and aesthetic judgments in musical contexts and highlight how visual cues dynamically influence those judgments over time. PMID:28487662
ERIC Educational Resources Information Center
Farber, Ellen A.; Moely, Barbara E.
Results of two studies investigating children's abilities to use different kinds of cues to infer another's affective state are reported in this paper. In the first study, 48 children (3, 4, and 6 to 7 years of age) were given three different kinds of tasks (interpersonal task, facial recognition task, and vocal recognition task). A cross-age…
Tsotsi, Stella; Kosmidis, Mary H; Bozikas, Vasilis P
2017-08-01
In schizophrenia, impaired facial affect recognition (FAR) has been associated with patients' overall social functioning. Interventions targeting attention or FAR per se have invariably yielded improved FAR performance in these patients. Here, we compared the effects of two interventions, one targeting FAR and one targeting attention-to-facial-features, with treatment-as-usual on patients' FAR performance. Thirty-nine outpatients with schizophrenia were randomly assigned to one of three groups: FAR intervention (training to recognize emotional information, conveyed by changes in facial features), attention-to-facial-features intervention (training to detect changes in facial features), and treatment-as-usual. Also, 24 healthy controls, matched for age and education, were assigned to one of the two interventions. Two FAR measurements, baseline and post-intervention, were conducted using an original experimental procedure with alternative sets of stimuli. We found improved FAR performance following the intervention targeting FAR in comparison to the other patient groups, which in fact was comparable to the pre-intervention performance of healthy controls in the corresponding intervention group. This improvement was more pronounced in recognizing fear. Our findings suggest that compared to interventions targeting attention, and treatment-as-usual, training programs targeting FAR can be more effective in improving FAR in patients with schizophrenia, particularly assisting them in perceiving threat-related information more accurately. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
Braarud, Hanne Cecilie; Skotheim, Siv; Høie, Kjartan; Markhus, Maria Wik; Kjellevold, Marian; Graff, Ingvild Eide; Berle, Jan Øystein; Stormark, Kjell Morten
2017-08-01
Depression in the postpartum period involves feelings of sadness, anxiety and irritability, and attenuated feelings of pleasure and comfort with the infant. Even mild-to-moderate symptoms of depression seem to have an impact on caregivers' affective availability and contingent responsiveness. The aim of the present study was to investigate non-depressed and sub-clinically depressed mothers' interest and affective expression during contingent and non-contingent face-to-face interaction with their infant. The study utilized a double video (DV) set-up. The mother and the infant were presented with live real-time video sequences, which allowed for mutually responsive interaction between the mother and the infant (Live contingent sequences), or replay sequences where the interaction was set out of phase (Replay non-contingent sequences). The DV set-up consisted of five sequences: Live1-Replay1-Live2-Replay2-Live3. Based on their scores on the Edinburgh Postnatal Depression Scale (EPDS), the mothers were divided into a non-depressed and a sub-clinically depressed group (EPDS score ≥ 6). A three-way split-plot ANOVA showed that the sub-clinically depressed mothers displayed the same amount of positive and negative facial affect independent of the quality of the interaction with the infants. The non-depressed mothers displayed more positive facial affect during the non-contingent than the contingent interaction sequences, while there was no such effect for negative facial affect. The results indicate that even sub-clinical levels of depressive symptoms influence mothers' affective facial expression during early face-to-face interaction with their infants. One of the clinical implications is to consider even sub-clinical depressive symptoms as a risk factor for mother-infant relationship disturbances. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-06
... Process To Develop Consumer Data Privacy Code of Conduct Concerning Facial Recognition Technology AGENCY... technology. This Notice announces the meetings to be held in February, March, April, May, and June 2014. The... promote trust regarding facial recognition technology in the commercial context. NTIA encourages...
Putting the face in context: Body expressions impact facial emotion processing in human infants.
Rajhans, Purva; Jessen, Sarah; Missana, Manuela; Grossmann, Tobias
2016-06-01
Body expressions exert strong contextual effects on facial emotion perception in adults. Specifically, conflicting body cues hamper the recognition of emotion from faces, as evident on both the behavioral and neural level. We examined the developmental origins of the neural processes involved in emotion perception across body and face in 8-month-old infants by measuring event-related brain potentials (ERPs). We primed infants with body postures (fearful, happy) that were followed by either congruent or incongruent facial expressions. Our results revealed that body expressions impact facial emotion processing and that incongruent body cues impair the neural discrimination of emotional facial expressions. Priming effects were associated with attentional and recognition memory processes, as reflected in a modulation of the Nc and Pc evoked at anterior electrodes. These findings demonstrate that 8-month-old infants possess neural mechanisms that allow for the integration of emotion across body and face, providing evidence for the early developmental emergence of context-sensitive facial emotion perception. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.
Pawah, Salil; Sikri, Arpit; Rexwal, Pushpanjali; Aggarwal, Prachi
2017-01-01
Along with function, aesthetics plays an important role in treating partially or completely edentulous patients. Ageing, trauma, tooth loss and neuromuscular disorders have a high impact on the tonicity of the facial musculature, the elasticity of the skin, and muscle function. Patients affected by Bell's palsy experience functional, aesthetic and psychological impairment. Common problems are partial closure of the upper eyelid, sagging of the lower eyelid and drooping of the angle of the mouth, leading to facial asymmetry along with difficulty in eating, drinking and speaking. The key to aesthetic restoration is to support and harmonize the collapsed facial musculature with the help of various prosthodontic treatment approaches. This case report focuses on treating a completely edentulous patient affected by Bell's palsy with a special prosthesis supporting the angle of the mouth and the lower eyelid, using a novel technique. PMID:28658922
Another Scale for the Assessment of Facial Paralysis? ADS Scale: Our Proposition, How to Use It.
Di Stadio, Arianna
2015-12-01
Over the years, several authors have proposed different methods to evaluate the affected areas and specific movement deficits in patients with facial palsy. Despite these efforts, the House-Brackmann scale remains the most widely used assessment in the medical community. The aim of our study was to propose and assess a new rating scale, the Arianna Disease Scale (ADS), for the clinical evaluation of facial paralysis. Sixty patients affected by unilateral facial Bell palsy were enrolled in a prospective study from 2012 to 2014. Their facial nerve function was evaluated with our assessment, analysing the face divided into upper, middle and lower thirds. We analysed different facial expressions, with each movement corresponding to the action of different muscles. The action of each muscle was scored from 0 to 1, with 0 corresponding to complete flaccid paralysis and 1 to normal muscle function. Synkinesis was also considered and evaluated in the scale, with a fixed score of 0.5. Our results considered the ease and speed of the assessment, the accuracy in capturing muscle deficit, and the ability to score synkinesis. All three observers agreed 100% on the highest degree of deficit. We found some discrepancies at intermediate scores, with 92% agreement in the upper face, 87% in the middle and 80% in the lower face, where more muscles are involved in the movements. Our scale had some limitations linked to the small group of patients evaluated, and the intermediate scores of 0.3 and 0.7 were somewhat difficult to interpret. Nevertheless, it was an accurate tool for quickly evaluating facial nerve function and has potential as an alternative scale for grading and diagnosing facial nerve disorders.
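The per-muscle scoring scheme described in the abstract can be sketched as a simple computation. This is a hypothetical illustration only: the abstract does not specify how muscle scores and the fixed 0.5 synkinesis score are aggregated, so the averaging rule and the function name below are assumptions, not the published ADS instrument.

```python
# Illustrative sketch of an ADS-style composite score (assumptions noted below).
# Each muscle action is rated from 0.0 (complete flaccid paralysis) to 1.0
# (normal function); synkinesis, when present, enters as a fixed 0.5 score.

def ads_region_score(muscle_scores, synkinesis_present=False):
    """Average the per-muscle ratings for one facial third.

    muscle_scores: floats in [0.0, 1.0], one per assessed muscle action.
    synkinesis_present: if True, the fixed 0.5 synkinesis score is pooled
    with the muscle items (aggregation rule is an assumption).
    """
    scores = list(muscle_scores)
    if synkinesis_present:
        scores.append(0.5)  # fixed synkinesis score per the abstract
    return sum(scores) / len(scores)

# Example: an upper third with two normally functioning actions and one
# completely paralysed action.
upper = ads_region_score([1.0, 1.0, 0.0])
print(round(upper, 2))  # 0.67
```

Scoring each facial third separately mirrors the abstract's upper/middle/lower division, under which inter-observer agreement was reported per region.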
Perceptions of midline deviations among different facial types.
Williams, Ryan P; Rinchuse, Daniel J; Zullo, Thomas G
2014-02-01
The correction of a deviated midline can involve complicated mechanics and a protracted treatment. The threshold below which midline deviations are considered acceptable might depend on multiple factors. The objective of this study was to evaluate the effect of facial type on laypersons' perceptions of various degrees of midline deviation. Smiling photographs of male and female subjects were altered to create 3 facial type variations (euryprosopic, mesoprosopic, and leptoprosopic) and deviations in the midline ranging from 0.0 to 4.0 mm. Evaluators rated the overall attractiveness and acceptability of each photograph. Data were collected from 160 raters. The overall threshold for the acceptability of a midline deviation was 2.92 ± 1.10 mm, with the threshold for the male subject significantly lower than that for the female subject. The euryprosopic facial type showed no decrease in mean attractiveness until the deviations were 2 mm or more. All other facial types were rated as decreasingly attractive from 1 mm onward. Among all facial types, the attractiveness of the male subject was only affected at deviations of 2 mm or greater; for the female subject, the attractiveness scores were significantly decreased at 1 mm. The mesoprosopic facial type was most attractive for the male subject but was the least attractive for the female subject. Facial type and sex may affect the thresholds at which a midline deviation is detected and above which a midline deviation is considered unacceptable. Both the euryprosopic facial type and male sex were associated with higher levels of attractiveness at relatively small levels of deviations. Copyright © 2014 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
Rodway, Paul; Wright, Lynn; Hardie, Scott
2003-12-01
The right hemisphere has often been viewed as having a dominant role in the processing of emotional information. Other evidence indicates that both hemispheres process emotional information but their involvement is valence specific, with the right hemisphere dealing with negative emotions and the left hemisphere preferentially processing positive emotions. This has been found under both restricted (Reuter-Lorenz & Davidson, 1981) and free viewing conditions (Jansari, Tranel, & Adolphs, 2000). It remains unclear whether the valence-specific laterality effect is also sex specific or is influenced by the handedness of participants. To explore this issue we repeated Jansari et al.'s free-viewing laterality task with 78 participants. We found a valence-specific laterality effect in women but not men, with women discriminating negative emotional expressions more accurately when the face was presented on the left-hand side and discriminating positive emotions more accurately when those faces were presented on the right-hand side. These results indicate that under free viewing conditions women are more lateralised for the processing of facial emotion than are men. Handedness did not affect the lateralised processing of facial emotion. Finally, participants demonstrated a response bias on control trials, where facial emotion did not differ between the faces. Participants selected the left-hand side more frequently when they believed the expression was negative and the right-hand side more frequently when they believed the expression was positive. This response bias can cause a spurious valence-specific laterality effect which might have contributed to the conflicting findings within the literature.
Modeling 3D Facial Shape from DNA
Claes, Peter; Liberton, Denise K.; Daniels, Katleen; Rosana, Kerri Matthes; Quillen, Ellen E.; Pearson, Laurel N.; McEvoy, Brian; Bauchet, Marc; Zaidi, Arslan A.; Yao, Wei; Tang, Hua; Barsh, Gregory S.; Absher, Devin M.; Puts, David A.; Rocha, Jorge; Beleza, Sandra; Pereira, Rinaldo W.; Baynam, Gareth; Suetens, Paul; Vandermeulen, Dirk; Wagner, Jennifer K.; Boster, James S.; Shriver, Mark D.
2014-01-01
Human facial diversity is substantial, complex, and largely scientifically unexplained. We used spatially dense quasi-landmarks to measure face shape in population samples with mixed West African and European ancestry from three locations (United States, Brazil, and Cape Verde). Using bootstrapped response-based imputation modeling (BRIM), we uncover the relationships between facial variation and the effects of sex, genomic ancestry, and a subset of craniofacial candidate genes. The facial effects of these variables are summarized as response-based imputed predictor (RIP) variables, which are validated using self-reported sex, genomic ancestry, and observer-based facial ratings (femininity and proportional ancestry) and judgments (sex and population group). By jointly modeling sex, genomic ancestry, and genotype, the independent effects of particular alleles on facial features can be uncovered. Results on a set of 20 genes showing significant effects on facial features provide support for this approach as a novel means to identify genes affecting normal-range facial features and for approximating the appearance of a face from genetic markers. PMID:24651127
A causal role for the anterior mid-cingulate cortex in negative affect and cognitive control.
Tolomeo, Serenella; Christmas, David; Jentzsch, Ines; Johnston, Blair; Sprengelmeyer, Reiner; Matthews, Keith; Douglas Steele, J
2016-06-01
Converging evidence has linked the anterior mid-cingulate cortex to negative affect, pain and cognitive control. It has previously been proposed that this region uses information about punishment to control aversively motivated actions. Studies on the effects of lesions allow causal inferences about brain function; however, naturally occurring lesions in the anterior mid-cingulate cortex are rare. In two studies we therefore recruited 94 volunteers, comprising 15 patients with treatment-resistant depression who had received bilateral anterior cingulotomy, which consists of lesions made within the anterior mid-cingulate cortex, 20 patients with treatment-resistant depression who had not received surgery and 59 healthy control subjects. Using the Ekman 60 faces paradigm and two Stroop paradigms, we tested the hypothesis that patients who received anterior cingulotomy were impaired in recognizing negative facial affect expressions but not positive or neutral facial expressions, and impaired in Stroop cognitive control, with larger lesions being associated with more impairment. Consistent with this hypothesis, we found that larger volume lesions predicted more impairment in recognizing fear, disgust and anger, and no impairment in recognizing facial expressions of surprise or happiness. However, we found no impairment in recognizing expressions of sadness. Also consistent with the hypothesis, we found that larger volume lesions predicted impaired Stroop cognitive control. Notably, this relationship was only present when anterior mid-cingulate cortex lesion volume was defined as the overlap between cingulotomy lesion volume and Shackman's meta-analysis-derived binary masks for negative affect and cognitive control. 
Given substantial evidence from healthy subjects that the anterior mid-cingulate cortex is part of a network associated with the experience of negative affect and pain, engaging cognitive control processes for optimizing behaviour in the presence of such stimuli, our findings support the assertion that this region has a causal role in these processes. While the clinical justification for cingulotomy is empirical and not theoretical, it is plausible that lesions within a brain region associated with the subjective experience of negative affect and pain may be therapeutic for patients with otherwise intractable mood, anxiety and pain syndromes. © The Author (2016). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Yan, Xiaoqian; Young, Andrew W; Andrews, Timothy J
2017-12-01
The aim of this study was to investigate the causes of the own-race advantage in facial expression perception. In Experiment 1, we investigated Western Caucasian and Chinese participants' perception and categorization of facial expressions of six basic emotions that included two pairs of confusable expressions (fear and surprise; anger and disgust). People were slightly better at identifying facial expressions posed by own-race members (mainly in anger and disgust). In Experiment 2, we asked whether the own-race advantage was due to differences in the holistic processing of facial expressions. Participants viewed composite faces in which the upper part of one expression was combined with the lower part of a different expression. The upper and lower parts of the composite faces were either aligned or misaligned. Both Chinese and Caucasian participants were better at identifying the facial expressions from the misaligned images, showing interference on recognizing the parts of the expressions created by holistic perception of the aligned composite images. However, this interference from holistic processing was equivalent across expressions of own-race and other-race faces in both groups of participants. Whilst the own-race advantage in recognizing facial expressions does seem to reflect the confusability of certain emotions, it cannot be explained by differences in holistic processing.
Emotion understanding in postinstitutionalized Eastern European children
WISMER FRIES, ALISON B.; POLLAK, SETH D.
2005-01-01
To examine the effects of early emotional neglect on children’s affective development, we assessed children who had experienced institutionalized care prior to adoption into family environments. One task required children to identify photographs of facial expressions of emotion. A second task required children to match facial expressions to an emotional situation. Internationally adopted, postinstitutionalized children had difficulty identifying facial expressions of emotion. In addition, postinstitutionalized children had significant difficulty matching appropriate facial expressions to happy, sad, and fearful scenarios. However, postinstitutionalized children performed as well as comparison children when asked to identify and match angry facial expressions. These results are discussed in terms of the importance of emotional input early in life on later developmental organization. PMID:15487600
Olatunji, Bunmi O; Lohr, Jeffrey M; Sawchuk, Craig N; Westendorf, David H
2005-01-01
Two experiments examine the use of an evaluative conditioning (EC) paradigm in the acquisition of fear and disgust responding to neutral facial expressions. In Experiment 1, 60 participants were randomly assigned to one of three evaluative learning conditions in which neutral facial expressions were paired with fearsome, disgusting, or neutral pictures. No statistically significant differences were detected between the three conditions. However, significant differences emerged within subjects, as post-exposure fear and disgust ratings were higher for expressions that had been paired with pictorial stimuli. Experiment 2 sought to examine whether an analogue sample of BII phobics would be more susceptible than nonphobic controls to fear and disgust EC utilizing a similar experimental design, given the co-occurrence of fear and disgust in BII-phobic responding. Results failed to demonstrate an EC effect specific to the analogue phobic group, although both groups showed an evaluative shift toward disgust for those facial expressions paired with BII-relevant pictures. Consistent with previous findings, examination of picture rating data suggested that analogue BII phobics rated the BII pictures as significantly more disgusting than fearful. The role of EC processes and a priori expectancy biases in the associative learning of disgust in BII phobia is discussed.
Magrelli, Silvia; Jermann, Patrick; Noris, Basilio; Ansermet, François; Hentsch, François; Nadel, Jacqueline; Billard, Aude
2013-01-01
This study investigates attention orienting to social stimuli in children with Autism Spectrum Conditions (ASC) during dyadic social interactions taking place in real-life settings. We study the effect of social cues that differ in complexity and distinguish between social cues produced by facial expressions of emotion and those produced during speech. We record the children's gazes using a head-mounted eye-tracking device and report on a detailed and quantitative analysis of the motion of the gaze in response to the social cues. The study encompasses a group of children with ASC from 2 to 11 years old (n = 14) and a group of typically developing (TD) children between 3 and 6 years old (n = 17). While the two groups orient overtly to facial expressions, children with ASC do so to a lesser extent. Children with ASC differ importantly from TD children in the way they respond to speech cues, displaying little overt shifting of attention to speaking faces. When children with ASC orient to facial expressions, they show reaction times and first fixation lengths similar to those presented by TD children. However, children with ASC orient to speaking faces more slowly than TD children. These results support the hypothesis that individuals affected by ASC have difficulties processing complex social sounds and detecting intermodal correspondence between facial and vocal information. It also corroborates evidence that people with ASC show reduced overt attention toward social stimuli. PMID:24312064
Cutaneous Sensibility Changes in Bell's Palsy Patients.
Cárdenas Palacio, Carlos Andrés; Múnera Galarza, Francisco Alejandro
2017-05-01
Objective Bell's palsy is a cranial nerve VII dysfunction that renders the patient unable to control facial muscles from the affected side. Nevertheless, some patients have reported cutaneous changes in the paretic area. Therefore, cutaneous sensibility changes might be possible additional symptoms within the clinical presentation of this disorder. Accordingly, the aim of this research was to investigate the relationship between cutaneous sensibility and facial paralysis severity in these patients. Study Design Prospective longitudinal cohort study. Setting Tertiary care medical center. Subjects and Methods Twelve acute-onset Bell's palsy patients were enrolled from March to September 2009. In addition, 12 sex- and age-matched healthy volunteers were tested. Cutaneous sensibility was evaluated with pressure threshold and 2-point discrimination at 6 areas of the face. Facial paralysis severity was evaluated with the House-Brackmann scale. Results Statistically significant correlations based on Spearman's test were found between facial paralysis severity and cutaneous sensitivity on the forehead, eyelid, cheek, nose, and lip (P < .05). Additionally, significant differences based on Student's t test were observed between both sides of the face in 2-point discrimination on the eyelid, cheek, and lip (P < .05) in Bell's palsy patients but not in healthy subjects. Conclusion These results suggest a possible relationship between the loss of motor control of the face and changes in facial sensory information processing. These findings warrant further research into the neurophysiologic changes associated with the cutaneous sensibility disturbances of these patients.
Why do we laugh at misfortunes? An electrophysiological exploration of comic situation processing.
Manfredi, Mirella; Adorni, Roberta; Proverbio, Alice Mado
2014-08-01
The goal of the present study was to shed some light on a particular kind of humour, called slapstick, by measuring brain bioelectrical activity during the perception of funny vs. non-funny pictures involving misfortunate circumstances. According to our hypothesis, the element mostly providing a comic feature in a misfortunate situation is the facial expression of the victims: the observer's reaction will usually be laughing only if the victims show a funny bewildered face rather than a painful or angry expression. Several coloured photographs depicting people involved in misfortunate situations were presented to 30 Italian healthy volunteers, while their EEG was recorded. Three different situations were considered: people showing a painful or an angry expression (Affective); people showing a bewildered expression and thus a comic look (Comic); people engaged in similar misfortunate situations but with no face visible (No Face). Results showed that the mean amplitude of both the posterior N170 and anterior N220 components was much larger in response to comic pictures than to the other stimuli. This early response could be considered the first identification of a comic element and evidence of the compelling and automatic response that usually characterizes people's amused reaction during a misfortune. In addition, we observed a larger P300 amplitude in response to comic than affective pictures, probably reflecting a more conscious processing of the comic element. Finally, no face pictures elicited an anteriorly distributed N400, which might reflect the effort to comprehend the nature of the situation displayed without any affective facial information, and a late positivity, possibly indexing a re-analysis processing of the unintelligible misfortunate situation (comic or unhappy) depicted in the No Face stimuli.
These data support the hypothesis that the facial expression of the victims acts as a specific trigger for the amused feeling that observers usually experience when someone falls down. Overall, the data indicate the existence of a neural circuit that is capable of recognizing and appreciating the comic element of a misfortunate situation in a group of young adults. Copyright © 2014 Elsevier Ltd. All rights reserved.
Asymmetric bias in perception of facial affect among Roman and Arabic script readers.
Heath, Robin L; Rouhana, Aida; Ghanem, Dana Abi
2005-01-01
The asymmetric chimeric faces test is used frequently as an indicator of right hemisphere involvement in the perception of facial affect, as the test is considered free of linguistic elements. Much of the original research with the asymmetric chimeric faces test was conducted with subjects reading left-to-right Roman script, i.e., English. As readers of right-to-left scripts, such as Arabic, demonstrated a mixed or weak rightward bias in judgements of facial affect, the influence of habitual scanning direction was thought to intersect with laterality. We administered the asymmetric chimeric faces test to 1239 adults who represented a range of script experience, i.e., Roman script readers (English and French), Arabic readers, bidirectional readers of Roman and Arabic scripts, and illiterates. Our findings supported the hypothesis that the bias in facial affect judgement is rooted in laterality, but can be influenced by script direction. Specifically, right-handed readers of Roman script demonstrated the greatest mean leftward score, and mixed-handed Arabic script readers demonstrated the greatest mean rightward score. Biliterates showed a gradual shift in asymmetric perception, as their scores fell between those of Roman and Arabic script readers, basically distributed in the order expected by their handedness and most often used script. Illiterates, whose only directional influence was laterality, showed a slight leftward bias.
Children’s Empathy Responses and their Understanding of Mother’s Emotions
Tully, Erin C.; Donohue, Meghan Rose; Garcia, Sarah E.
2014-01-01
This study investigated children’s empathic responses to their mother’s distress to provide insight about child factors that contribute to parental socialization of emotions. Four- to six-year-old children (N = 82) observed their mother’s sadness and anger during a simulated emotional phone conversation. Children’s facial negative affect was rated and their heart rate variability (HRV) was recorded during the conversation, and their emotion understanding of the conversation was measured through their use of negative emotion words and perspective-taking themes (i.e., discussing the causes or resolution of mother’s emotions) in narrative accounts of the conversation. There were positive quadratic relationships between HRV and ratings of facial affect, narrative references to mother’s negative emotions, and perspective-taking themes. High and low HRV were associated with high facial negative affect, suggesting well-regulated sympathy and poorly regulated personal distress empathic responses, respectively. Moderate HRV was associated with low facial negative affect, suggesting minimal empathic engagement. High and low HRV were associated with the highest probabilities of both emotion understanding indicators, suggesting both sympathy and personal distress responses to mother’s distress facilitate understanding of mother’s emotions. Personal distress may motivate attempts to understand mother’s emotions as a self-soothing strategy, whereas sympathy-related attempts to understand may be motivated by altruism. PMID:24650197
Cultural Differences in Affect Intensity Perception in the Context of Advertising
Pogosyan, Marianna; Engelmann, Jan B.
2011-01-01
Cultural differences in the perception of positive affect intensity within an advertising context were investigated among American, Japanese, and Russian participants. Participants were asked to rate the intensity of facial expressions of positive emotions, which displayed either subtle, low intensity, or salient, high intensity expressions of positive affect. In agreement with previous findings from cross-cultural psychological research, current results demonstrate both cross-cultural agreement and differences in the perception of positive affect intensity across the three cultures. Specifically, American participants perceived high arousal (HA) images as significantly less calm than participants from the other two cultures, while the Japanese participants perceived low arousal (LA) images as significantly more excited than participants from the other cultures. The underlying mechanisms of these cultural differences were further investigated through difference scores that probed for cultural differences in perception and categorization of positive emotions. Findings indicate that rating differences are due to (1) perceptual differences in the extent to which HA images were discriminated from LA images, and (2) categorization differences in the extent to which facial expressions were grouped into affect intensity categories. Specifically, American participants revealed significantly higher perceptual differentiation between arousal levels of facial expressions in high and intermediate intensity categories. Japanese participants, on the other hand, did not discriminate between high and low arousal affect categories to the same extent as did the American and Russian participants. These findings indicate the presence of cultural differences in underlying decoding mechanisms of facial expressions of positive affect intensity. Implications of these results for global advertising are discussed. PMID:22084635
Rezlescu, Constantin; Duchaine, Brad; Olivola, Christopher Y; Chater, Nick
2012-01-01
Many human interactions are built on trust, so widespread confidence in first impressions generally favors individuals with trustworthy-looking appearances. However, few studies have explicitly examined: 1) the contribution of unfakeable facial features to trust-based decisions, and 2) how these cues are integrated with information about past behavior. Using highly controlled stimuli and an improved experimental procedure, we show that unfakeable facial features associated with the appearance of trustworthiness attract higher investments in trust games. The facial trustworthiness premium is large for decisions based solely on faces, with trustworthy identities attracting 42% more money (Study 1), and remains significant though reduced to 6% when reputational information is also available (Study 2). The face trustworthiness premium persists with real (rather than virtual) currency and when higher payoffs are at stake (Study 3). Our results demonstrate that cooperation may be affected not only by controllable appearance cues (e.g., clothing, facial expressions) as shown previously, but also by features that are impossible to mimic (e.g., individual facial structure). This unfakeable face trustworthiness effect is not limited to the rare situations where people lack any information about their partners, but survives in richer environments where relevant details about partner past behavior are available.
The Relationship between Processing Facial Identity and Emotional Expression in 8-Month-Old Infants
ERIC Educational Resources Information Center
Schwarzer, Gudrun; Jovanovic, Bianca
2010-01-01
In Experiment 1, it was investigated whether infants process facial identity and emotional expression independently or in conjunction with one another. Eight-month-old infants were habituated to two upright or two inverted faces varying in facial identity and emotional expression. Infants were tested with a habituation face, a switch face, and a…
From Facial Emotional Recognition Abilities to Emotional Attribution: A Study in Down Syndrome
ERIC Educational Resources Information Center
Hippolyte, Loyse; Barisnikov, Koviljka; Van der Linden, Martial; Detraux, Jean-Jacques
2009-01-01
Facial expression processing and the attribution of facial emotions to a context were investigated in adults with Down syndrome (DS) in two experiments. Their performances were compared with those of a child control group matched for receptive vocabulary. The ability to process faces without emotional content was controlled for, and no differences…
The perceptual saliency of fearful eyes and smiles: A signal detection study
Saban, Muhammet Ikbal; Rotshtein, Pia
2017-01-01
Facial features differ in the amount of expressive information they convey. Specifically, eyes are argued to be essential for fear recognition, while smiles are crucial for recognising happy expressions. In three experiments, we tested whether expression modulates the perceptual saliency of diagnostic facial features and whether the feature’s saliency depends on the face configuration. Participants were presented with masked facial features or noise at perceptual conscious threshold. The task was to indicate whether eyes (experiments 1-3A) or a mouth (experiment 3B) was present. The expression of the face and its configuration (i.e. spatial arrangement of the features) were manipulated. Experiment 1 compared fearful with neutral expressions; experiments 2 and 3 compared fearful versus happy expressions. The detection accuracy data were analysed using Signal Detection Theory (SDT), to examine the effects of expression and configuration on perceptual precision (d’) and response bias (c), separately. Across all three experiments, fearful eyes were detected better (higher d’) than neutral and happy eyes. Eyes were more precisely detected than mouths, whereas smiles were detected better than fearful mouths. The configuration of the features had no consistent effects across the experiments on the ability to detect expressive features. Facial configuration, however, consistently affected the response bias: participants used a more liberal criterion for detecting the eyes in canonical configuration and fearful expression. Finally, the power in low spatial frequency of a feature predicted its discriminability index. The results suggest that expressive features are perceptually more salient, with a higher d’, due to changes in low-level visual properties, with emotions and configuration affecting perception through top-down processes, as reflected by the response bias. PMID:28267761
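The d' (sensitivity) and c (response bias) indices referred to in this abstract are standard signal detection quantities derived from hit and false-alarm rates. As a rough illustration only (not the authors' analysis code, and with made-up trial counts), they can be computed like so:

```python
from statistics import NormalDist

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """Compute signal detection sensitivity (d') and criterion (c).

    Uses a common log-linear correction (add 0.5 to each cell) so that
    perfect hit or false-alarm rates do not produce infinite z-scores.
    """
    h = (hits + 0.5) / (hits + misses + 1.0)                    # corrected hit rate
    f = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf                                     # probit (inverse normal CDF)
    d_prime = z(h) - z(f)                                        # perceptual precision
    criterion = -0.5 * (z(h) + z(f))                             # bias; c > 0 = conservative
    return d_prime, criterion

# Hypothetical observer detecting masked eyes: 100 target and 100 noise trials
dp, c = sdt_measures(hits=80, misses=20, false_alarms=20, correct_rejections=80)
# d' ≈ 1.66 and c ≈ 0 for this symmetric case
```

A "more liberal criterion", as reported for eyes in canonical fearful faces, corresponds to a negative c: the observer reports "present" more readily, raising both hits and false alarms.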
Age Deficits in Facial Affect Recognition: The Influence of Dynamic Cues.
Grainger, Sarah A; Henry, Julie D; Phillips, Louise H; Vanman, Eric J; Allen, Roy
2017-07-01
Older adults have difficulties in identifying most facial expressions of emotion. However, most aging studies have presented static photographs of intense expressions, whereas in everyday experience people see emotions that develop and change. The present study was designed to assess whether age-related difficulties with emotion recognition are reduced when more ecologically valid (i.e., dynamic) stimuli are used. We examined the effect of stimulus format (i.e., static vs. dynamic) on facial affect recognition in two separate studies that included independent samples and distinct stimulus sets. In addition to younger and older participants, a middle-aged group was included in Study 1, and eye gaze patterns were assessed in Study 2. Across both studies, older adults performed worse than younger adults on measures of facial affect recognition. In Study 1, older and middle-aged adults benefited from dynamic stimuli, but only when the emotional displays were subtle. Younger adults gazed more at the eye region of the face relative to older adults (Study 2), but dynamic presentation increased attention towards the eye region for younger adults only. Together, these studies provide important and novel insights into the specific circumstances in which older adults may be expected to experience difficulties in perceiving facial emotions. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Mossaheb, Nilufar; Kaufmann, Rainer M; Schlögelhofer, Monika; Aninilkumparambil, Thushara; Himmelbauer, Claudia; Gold, Anna; Zehetmayer, Sonja; Hoffmann, Holger; Traue, Harald C; Aschauer, Harald
2018-01-01
Social interactive functions such as facial emotion recognition and smell identification have been shown to differ between women and men. However, little is known about how these differences are mirrored in patients with schizophrenia and how these abilities interact with each other and with other clinical variables in patients vs. healthy controls. Standardized instruments were used to assess facial emotion recognition [Facially Expressed Emotion Labelling (FEEL)] and smell identification [University of Pennsylvania Smell Identification Test (UPSIT)] in 51 patients with schizophrenia spectrum disorders and 79 healthy controls; furthermore, working memory functions and clinical variables were assessed. In both the univariate and the multivariate results, illness showed a significant influence on UPSIT and FEEL. The inclusion of age and working memory in the MANOVA resulted in a differential effect with sex and working memory as remaining significant factors. Duration of illness was correlated with both emotion recognition and smell identification in men only, whereas immediate general psychopathology and negative symptoms were associated with emotion recognition only in women. Being affected by schizophrenia spectrum disorder impacts one's ability to correctly recognize facial affects and identify odors. Converging evidence suggests a link between the investigated basic and social cognitive abilities in patients with schizophrenia spectrum disorders with a strong contribution of working memory and differential effects of modulators in women vs. men.
Meyer, Anna H; Woods, Michael G; Manton, David J
2014-03-01
This study was designed to assess the influence that the buccal corridor might have on the frontal facial attractiveness of subjects who had received orthodontic treatment with or without 4 premolar extractions. Posttreatment full-face frontal smiling photographs of 30 premolar extraction and 27 nonextraction patients were evaluated by 20 orthodontists, 20 dentists, and 20 laypeople using a visual analog scale. The ratings were analyzed according to rater group, rater sex, and number of years in practice for orthodontists and dentists to search for any statistically significant differences in the ratings on the basis of treatment groups, subject sex, and buccal corridor widths and areas. Orthodontists and dentists gave higher mean overall frontal facial attractiveness scores than did laypeople. There were no significant differences in how men and women rated the study subjects. The number of years in practice did not affect how the orthodontists rated, but it did affect the ratings of the dentists. Female subjects were consistently rated as significantly more attractive than male subjects. There was no difference in ratings for the extraction and nonextraction subject groups. The buccal corridor widths and areas did not affect the frontal facial attractiveness ratings. If treatment has been carried out with thorough diagnosis and careful planning, neither the choice of extraction or nonextraction treatment, nor the resulting buccal corridor widths or areas appeared to affect the subjects' frontal facial attractiveness. Copyright © 2014 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
Adams, Reginald B.; Garrido, Carlos O.; Albohn, Daniel N.; Hess, Ursula; Kleck, Robert E.
2016-01-01
It might seem a reasonable assumption that when we are not actively using our faces to express ourselves (i.e., when we display nonexpressive, or neutral faces), those around us will not be able to read our emotions. Herein, using a variety of expression-related ratings, we examined whether age-related changes in the face can accurately reveal one’s innermost affective dispositions. In each study, we found that expressive ratings of neutral facial displays predicted self-reported positive/negative dispositional affect, but only for elderly women, and only for positive affect. These findings meaningfully replicate and extend earlier work examining age-related emotion cues in the face of elderly women (Malatesta et al., 1987a). We discuss these findings in light of evidence that women are expected to, and do, smile more than men, and that the quality of their smiles predicts their life satisfaction. Although ratings of old male faces did not significantly predict self-reported affective dispositions, the trend was similar to that found for old female faces. A plausible explanation for this gender difference is that in the process of attenuating emotional expressions over their lifetimes, old men reveal less evidence of their total emotional experiences in their faces than do old women. PMID:27445944
Aho-Özhan, Helena E A; Keller, Jürgen; Heimrath, Johanna; Uttner, Ingo; Kassubek, Jan; Birbaumer, Niels; Ludolph, Albert C; Lulé, Dorothée
2016-01-01
Amyotrophic lateral sclerosis (ALS) primarily impairs motor abilities but also affects cognition and emotional processing. We hypothesise that subjective ratings of emotional stimuli depicting social interactions and facial expressions are changed in ALS. It was previously found that recognition of negative emotions and the ability to mentalize others' intentions are reduced. Here, processing of emotions in faces was investigated. A behavioural test of Ekman faces expressing six basic emotions was presented to 30 ALS patients and 29 age-, gender-, and education-matched healthy controls. Additionally, a subgroup of 15 ALS patients who were able to lie supine in the scanner and 14 matched healthy controls viewed the Ekman faces during functional magnetic resonance imaging (fMRI). Affective state and the number of daily social contacts were measured. ALS patients recognized disgust and fear less accurately than healthy controls. In fMRI, reduced brain activity was seen in areas involved in processing of negative emotions, replicating our previous results. During processing of sad faces, increased brain activity was seen in areas associated with social emotions in the right inferior frontal gyrus and reduced activity in the hippocampus bilaterally. No differences in brain activity were seen for any of the other emotional expressions. Inferior frontal gyrus activity for sad faces was associated with an increased number of social contacts of ALS patients. ALS patients showed decreased brain and behavioural responses in processing of disgust and fear and an altered brain response pattern for sadness. The negative consequences of neurodegenerative processes in the course of ALS might be counteracted by positive emotional activity and positive social interactions.
Palumbo, Letizia; Jellema, Tjeerd
2013-01-01
Emotional facial expressions are immediate indicators of the affective dispositions of others. Recently it has been shown that early stages of social perception can already be influenced by (implicit) attributions made by the observer about the agent's mental state and intentions. In the current study possible mechanisms underpinning distortions in the perception of dynamic, ecologically-valid, facial expressions were explored. In four experiments we examined to what extent basic perceptual processes such as contrast/context effects, adaptation and representational momentum underpinned the perceptual distortions, and to what extent 'emotional anticipation', i.e. the involuntary anticipation of the other's emotional state of mind on the basis of the immediate perceptual history, might have played a role. Neutral facial expressions displayed at the end of short video-clips, in which an initial facial expression of joy or anger gradually morphed into a neutral expression, were misjudged as being slightly angry or slightly happy, respectively (Experiment 1). This response bias disappeared when the actor's identity changed in the final neutral expression (Experiment 2). Videos depicting neutral-to-joy-to-neutral and neutral-to-anger-to-neutral sequences again produced biases but in opposite direction (Experiment 3). The bias survived insertion of a 400 ms blank (Experiment 4). These results suggested that the perceptual distortions were not caused by any of the low-level perceptual mechanisms (adaptation, representational momentum and contrast effects). We speculate that especially when presented with dynamic, facial expressions, perceptual distortions occur that reflect 'emotional anticipation' (a low-level mindreading mechanism), which overrules low-level visual mechanisms. Underpinning neural mechanisms are discussed in relation to the current debate on action and emotion understanding.
Facial thermal variations: A new marker of emotional arousal.
Kosonogov, Vladimir; De Zorzi, Lucas; Honoré, Jacques; Martínez-Velázquez, Eduardo S; Nandrino, Jean-Louis; Martinez-Selva, José M; Sequeira, Henrique
2017-01-01
Functional infrared thermal imaging (fITI) is considered a promising method to measure emotional autonomic responses through facial cutaneous thermal variations. However, the facial thermal response to emotions still needs to be investigated within the framework of the dimensional approach to emotions. The main aim of this study was to assess how facial thermal variations index the emotional arousal and valence dimensions of visual stimuli. Twenty-four participants were presented with three groups of standardized emotional pictures (unpleasant, neutral and pleasant) from the International Affective Picture System. Facial temperature was recorded at the nose tip, an important region of interest for facial thermal variations, and compared to electrodermal responses, a robust index of emotional arousal. Both types of responses were also compared to subjective ratings of pictures. An emotional arousal effect was found on the amplitude and latency of thermal responses and on the amplitude and frequency of electrodermal responses. The participants showed greater thermal and dermal responses to emotional than to neutral pictures, with no difference between pleasant and unpleasant ones. Thermal responses correlated with subjective ratings, and dermal responses tended to do so. Finally, in the emotional conditions compared to the neutral one, the frequency of simultaneous thermal and dermal responses increased while isolated thermal or dermal responses decreased. Overall, this study brings convergent arguments to consider fITI as a promising method reflecting the arousal dimension of emotional stimulation and, consequently, as a credible alternative to the classical recording of electrodermal activity. The present research provides an original way to unveil autonomic implication in emotional processes and opens new perspectives to measure them in touchless conditions.
Ji, Ellen; Weickert, Cynthia Shannon; Lenroot, Rhoshel; Catts, Stanley V; Vercammen, Ans; White, Christopher; Gur, Raquel E; Weickert, Thomas W
2015-06-01
Growing evidence suggests that testosterone may play a role in the pathophysiology of schizophrenia given that testosterone has been linked to cognition and negative symptoms in schizophrenia. Here, we determine the extent to which serum testosterone levels are related to neural activity in affective processing circuitry in men with schizophrenia. Functional magnetic resonance imaging was used to measure blood-oxygen-level-dependent signal changes as 32 healthy controls and 26 people with schizophrenia performed a facial emotion identification task. Whole brain analyses were performed to determine regions of differential activity between groups during processing of angry versus non-threatening faces. A follow-up ROI analysis using a regression model in a subset of 16 healthy men and 16 men with schizophrenia was used to determine the extent to which serum testosterone levels were related to neural activity. Healthy controls displayed significantly greater activation than people with schizophrenia in the left inferior frontal gyrus (IFG). There was no significant difference in circulating testosterone levels between healthy men and men with schizophrenia. Regression analyses between activation in the IFG and circulating testosterone levels revealed a significant positive correlation in men with schizophrenia (r=.63, p=.01) and no significant relationship in healthy men. This study provides the first evidence that circulating serum testosterone levels are related to IFG activation during emotion face processing in men with schizophrenia but not in healthy men, which suggests that testosterone levels modulate neural processes relevant to facial emotion processing that may interfere with social functioning in men with schizophrenia. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.
Language and affective facial expression in children with perinatal stroke.
Lai, Philip T; Reilly, Judy S
2015-08-01
Children with perinatal stroke (PS) provide a unique opportunity to understand developing brain-behavior relations. Previous research has noted distinctive differences in behavioral sequelae between children with PS and adults with acquired stroke: children fare better, presumably due to the plasticity of the developing brain for adaptive reorganization. Whereas we are beginning to understand language development, we know little about another communicative domain, emotional expression. The current study investigates the use and integration of language and facial expression during an interview. As anticipated, the language performance of the five- and six-year-old PS group is comparable to that of their typically developing (TD) peers; however, their affective profiles are distinctive: those with right hemisphere injury are less expressive with respect to affective language and affective facial expression than either those with left hemisphere injury or the TD group. The two distinctive profiles for language and emotional expression in these children suggest gradients of neuroplasticity in the developing brain. Copyright © 2015 Elsevier Inc. All rights reserved.
Affective responsiveness, betrayal, and childhood abuse.
Reichmann-Decker, Aimee; DePrince, Anne P; McIntosh, Daniel N
2009-01-01
Several trauma-specific and emotion theories suggest that alterations in children's typical affective responses may serve an attachment function in the context of abuse by a caregiver or close other. For example, inhibiting negative emotional responses or expressions might help the child preserve a relationship with an abusive caregiver. Past research in this area has relied on self-report methods to discover links between affective responsiveness and caregiver abuse. Extending this literature, the current study used facial electromyography to assess affective responsiveness with 2 measures: mimicry of emotional facial expressions and affective modulation of startle. We predicted that women who reported childhood abuse by close others would show alterations in affective responsiveness relative to their peers. We tested 100 undergraduate women who reported histories of (a) childhood sexual or physical abuse by someone close, such as a parent (high-betrayal); (b) childhood abuse by someone not close (low-betrayal); or (c) no abuse in childhood (no-abuse). Especially when viewing women's emotional expressions, the high-betrayal group showed more mimicry of happy and less mimicry of angry faces relative to women who reported no- or low-betrayal abuse, who showed the opposite pattern. Furthermore, women who reported high-betrayal abuse showed less affective modulation of startle during pictures depicting men threatening women than did the other two groups. Findings suggest that, as predicted by betrayal trauma theory, women who have experienced high-betrayal abuse show alterations in automatic emotional processes consistent with caregiving-maintenance goals in an abusive environment.
Celik, Onur; Eskiizmir, Gorkem; Pabuscu, Yuksel; Ulkumen, Burak; Toker, Gokce Tanyeri
The exact etiology of Bell's palsy still remains obscure. The only authenticated finding is inflammation and edema of the facial nerve leading to entrapment inside the facial canal. To identify whether there is any relationship between the grade of Bell's palsy and the diameter of the facial canal, and also to study any possible anatomic predisposition of the facial canal for Bell's palsy, including segments that have not been studied before. Medical records and temporal computed tomography scans of 34 patients with Bell's palsy were utilized in this retrospective clinical study. Diameters of both facial canals (affected and unaffected) of each patient were measured at the labyrinthine segment, geniculate ganglion, tympanic segment, second genu, mastoid segment and stylomastoid foramen. The House-Brackmann (HB) scale of each patient at presentation and 3 months after treatment was evaluated from their medical records. The paired samples t-test and Wilcoxon signed-rank test were used for comparison of width between the affected and unaffected sides. The Wilcoxon signed-rank test was also used for evaluation of the relationship between the diameter of the facial canal and the grade of Bell's palsy. Significance was set at p=0.05 (IBM SPSS Statistics for Windows, Version 21.0.; Armonk, NY, IBM Corp). Thirty-four patients - 16 females, 18 males; mean age±Standard Deviation, 40.3±21.3 - with Bell's palsy were included in the study. According to the HB facial nerve grading system, 8 patients were grade V, 6 were grade IV, 11 were grade III, 8 were grade II and 1 patient was grade I. The mean width at the labyrinthine segment of the facial canal in the affected temporal bone was significantly smaller than the equivalent in the unaffected temporal bone (p=0.00).
There was no significant difference between the affected and unaffected temporal bones at the geniculate ganglion (p=0.87), tympanic segment (p=0.66), second genu (p=0.62), mastoid segment (p=0.67) and stylomastoid foramen (p=0.16). We did not find any relationship between the HB grade and the facial canal diameter at the level of the labyrinthine segment (p=0.41), tympanic segment (p=0.12), mastoid segment (p=0.14), geniculate ganglion (p=0.13) and stylomastoid foramen (p=0.44), while we found a significant relationship at the level of the second genu (p=0.02). We identified the diameter of the labyrinthine segment of the facial canal as an anatomic risk factor for Bell's palsy. We also found a significant relationship between the HB grade and facial canal diameter at the level of the second genu. Future studies (combined MRI-CT or 3D modeling) are needed to explore this possible relationship, especially at the second genu. Thus, in the future it may be possible to selectively decompress particular segments in high-grade Bell's palsy patients. Copyright © 2016 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.
Wang, Xu; Song, Yiying; Zhen, Zonglei; Liu, Jia
2016-05-01
Face perception is essential for daily and social activities. Neuroimaging studies have revealed a distributed face network (FN) consisting of multiple regions that exhibit preferential responses to invariant or changeable facial information. However, our understanding about how these regions work collaboratively to facilitate facial information processing is limited. Here, we focused on changeable facial information processing, and investigated how the functional integration of the FN is related to the performance of facial expression recognition. To do so, we first defined the FN as voxels that responded more strongly to faces than objects, and then used a voxel-based global brain connectivity method based on resting-state fMRI to characterize the within-network connectivity (WNC) of each voxel in the FN. By relating the WNC and performance in the "Reading the Mind in the Eyes" Test across participants, we found that individuals with stronger WNC in the right posterior superior temporal sulcus (rpSTS) were better at recognizing facial expressions. Further, the resting-state functional connectivity (FC) between the rpSTS and right occipital face area (rOFA), early visual cortex (EVC), and bilateral STS were positively correlated with the ability of facial expression recognition, and the FCs of EVC-pSTS and OFA-pSTS contributed independently to facial expression recognition. In short, our study highlights the behavioral significance of intrinsic functional integration of the FN in facial expression processing, and provides evidence for the hub-like role of the rpSTS for facial expression recognition. Hum Brain Mapp 37:1930-1940, 2016. © 2016 Wiley Periodicals, Inc.
Yetiser, Sertac
2018-06-08
Three patients with large intratemporal facial schwannomas underwent tumor removal and facial nerve reconstruction with hypoglossal anastomosis. The surgical strategy for each case was tailored to the location of the mass and its extension along the facial nerve. The aim was to provide data on the different clinical aspects of facial nerve schwannoma, the appropriate planning for management, and the predictive outcomes of facial function. Three patients with facial schwannomas (two men and one woman, ages 45, 36, and 52 years, respectively) who presented to the clinic between 2009 and 2015 were reviewed. They all had hearing loss but normal facial function. All patients were operated on with radical tumor removal via mastoidectomy and subtotal petrosectomy and simultaneous cranial nerve (CN) 7-CN 12 anastomosis. Multiple segments of the facial nerve were involved, ranging in size from 3 to 7 cm. In the follow-up period of 9 to 24 months, there was no tumor recurrence. Facial function was scored as House-Brackmann grades II and III, but two patients are still in the process of functional recovery. Conservative treatment with sparing of the nerve is considered in patients with small tumors. Excision of a large facial schwannoma with immediate hypoglossal nerve grafting as a primary procedure can provide satisfactory facial nerve function. One disadvantage of performing anastomosis is that there is not enough neural tissue just before the bifurcation of the main stump to provide neural suturing without tension, because middle fossa extension of the facial schwannoma frequently involves the main facial nerve at the stylomastoid foramen. Reanimation should therefore proceed with extensive backward mobilization of the hypoglossal nerve. Georg Thieme Verlag KG Stuttgart · New York.
Daly, Eileen M; Deeley, Quinton; Ecker, Christine; Craig, Michael; Hallahan, Brian; Murphy, Clodagh; Johnston, Patrick; Spain, Debbie; Gillan, Nicola; Brammer, Michael; Giampietro, Vincent; Lamar, Melissa; Page, Lisa; Toal, Fiona; Cleare, Anthony; Surguladze, Simon; Murphy, Declan G M
2012-10-01
People with autism spectrum disorders (ASDs) have lifelong deficits in social behavior and differences in behavioral as well as neural responses to facial expressions of emotion. The biological basis to this is incompletely understood, but it may include differences in the role of neurotransmitters such as serotonin, which modulate facial emotion processing in health. While some individuals with ASD have significant differences in the serotonin system, to our knowledge, no one has investigated its role during facial emotion processing in adults with ASD and control subjects using acute tryptophan depletion (ATD) and functional magnetic resonance imaging. To compare the effects of ATD on brain responses to primary facial expressions of emotion in men with ASD and healthy control subjects. Double-blind, placebo-controlled, crossover trial of ATD and functional magnetic resonance imaging to measure brain activity during incidental processing of disgust, fearful, happy, and sad facial expressions. Institute of Psychiatry, King's College London, and South London and Maudsley National Health Service Foundation Trust, England. Fourteen men of normal intelligence with autism and 14 control subjects who did not significantly differ in sex, age, or overall intelligence. Blood oxygenation level-dependent response to facial expressions of emotion. Brain activation was differentially modulated by ATD depending on diagnostic group and emotion type within regions of the social brain network. For example, processing of disgust faces was associated with interactions in medial frontal and lingual gyri, whereas processing of happy faces was associated with interactions in middle frontal gyrus and putamen. Modulation of the processing of facial expressions of emotion by serotonin significantly differs in people with ASD compared with control subjects. 
The differences vary with emotion type and occur in social brain regions that have been shown to be associated with group differences in serotonin synthesis/receptor or transporter density.
Razi's description and treatment of facial paralysis.
Tabatabaei, Seyed Mahmood; Kalantar Hormozi, Abdoljalil; Asadi, Mohsen
2011-01-01
In the modern medical era, facial paralysis is linked with the name of Charles Bell. This disease, usually a unilateral peripheral facial palsy, causes facial muscle weakness on the affected side. Bell gave a complete description of the disease, but other physicians had described it several hundred years earlier; their accounts were overlooked for various reasons, such as the difficulty of the original texts' language. The first and most famous of these physicians was Mohammad Ibn Zakaryya Razi (Rhazes). In this article, we discuss his opinion.
Facial affect recognition deficit as a marker of genetic vulnerability to schizophrenia.
Alfimova, Margarita V; Abramova, Lilia I; Barhatova, Aleksandra I; Yumatova, Polina E; Lyachenko, Galina L; Golimbet, Vera E
2009-05-01
The aim of this study was to investigate the possibility that affect recognition impairments are associated with genetic liability to schizophrenia. In a group of 55 unaffected relatives of schizophrenia patients (parents and siblings) we examined the capacity to detect facially expressed emotions and its relationship to schizotypal personality, neurocognitive functioning, and the subject's actual emotional state. The relatives were compared with 103 schizophrenia patients and 99 healthy subjects without any family history of psychoses. Emotional stimuli were nine black-and-white photos of actors, who portrayed six basic emotions as well as interest, contempt, and shame. The results showed an affect recognition deficit in relatives, though milder than that in the patients themselves. No correlation between the deficit and schizotypal personality measured with the SPQ was detected in the group of relatives. Neither cognitive functioning, including attention, verbal memory and linguistic ability, nor actual emotional states accounted for their affect recognition impairments. The results suggest that the facial affect recognition deficit in schizophrenia may be related to genetic predisposition to the disorder and may serve as an endophenotype in molecular-genetic studies.
The aging African-American face.
Brissett, Anthony E; Naylor, Michelle C
2010-05-01
With the desire to create a more youthful appearance, patients of all races and ethnicities are increasingly seeking nonsurgical and surgical rejuvenation. In particular, facial rejuvenation procedures have grown significantly within the African-American population. This increase has resulted in a paradigm shift in facial plastic surgery as one considers rejuvenation procedures in those of African descent, because the aging process of various racial groups differs from traditional models. The purpose of this article is to draw attention to the facial features unique to those of African descent and the role these features play in the aging process, taking care to highlight the differences from traditional models of facial aging. In addition, this article briefly describes the nonsurgical and surgical options for facial rejuvenation, taking into consideration the previously discussed facial aging differences and postoperative considerations. Thieme Medical Publishers.
Decoding facial blends of emotion: visual field, attentional and hemispheric biases.
Ross, Elliott D; Shayya, Luay; Champlain, Amanda; Monnot, Marilee; Prodan, Calin I
2013-12-01
Most clinical research assumes that modulation of facial expressions is lateralized predominantly across the right-left hemiface. However, social psychological research suggests that facial expressions are organized predominantly across the upper-lower face. Because humans learn to cognitively control facial expression for social purposes, the lower face may display a false emotion, typically a smile, to enable approach behavior. In contrast, the upper face may leak a person's true feeling state by producing a brief facial blend of emotion, i.e. a different emotion on the upper versus lower face. Previous studies from our laboratory have shown that upper facial emotions are processed preferentially by the right hemisphere under conditions of directed attention if facial blends of emotion are presented tachistoscopically to the mid left and right visual fields. This paper explores how facial blends are processed within the four visual quadrants. The results, combined with our previous research, demonstrate that lower more so than upper facial emotions are perceived best when presented to the viewer's left and right visual fields just above the horizontal axis. Upper facial emotions are perceived best when presented to the viewer's left visual field just above the horizontal axis under conditions of directed attention. Thus, by gazing at a person's left ear, which also avoids the social stigma of eye-to-eye contact, one's ability to decode facial expressions should be enhanced. Published by Elsevier Inc.
Discrimination of emotional facial expressions by tufted capuchin monkeys (Sapajus apella).
Calcutt, Sarah E; Rubin, Taylor L; Pokorny, Jennifer J; de Waal, Frans B M
2017-02-01
Tufted or brown capuchin monkeys (Sapajus apella) have been shown to recognize conspecific faces as well as categorize them according to group membership. Little is known, though, about their capacity to differentiate between emotionally charged facial expressions or whether facial expressions are processed as a collection of features or configurally (i.e., as a whole). In 3 experiments, we examined whether tufted capuchins (a) differentiate photographs of neutral faces from either affiliative or agonistic expressions, (b) use relevant facial features to make such choices or view the expression as a whole, and (c) demonstrate an inversion effect for facial expressions suggestive of configural processing. Using an oddity paradigm presented on a computer touchscreen, we collected data from 9 adult and subadult monkeys. Subjects discriminated between emotional and neutral expressions with an exceptionally high success rate, including differentiating open-mouth threats from neutral expressions even when the latter contained varying degrees of visible teeth and mouth opening. They also showed an inversion effect for facial expressions, results that may indicate that quickly recognizing expressions does not originate solely from feature-based processing but likely a combination of relational processes. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Impaired holistic coding of facial expression and facial identity in congenital prosopagnosia.
Palermo, Romina; Willis, Megan L; Rivolta, Davide; McKone, Elinor; Wilson, C Ellie; Calder, Andrew J
2011-04-01
We test 12 individuals with congenital prosopagnosia (CP), who replicate a common pattern of showing severe difficulty in recognising facial identity in conjunction with normal recognition of facial expressions (both basic and 'social'). Strength of holistic processing was examined using standard expression composite and identity composite tasks. Compared to age- and sex-matched controls, group analyses demonstrated that CPs showed weaker holistic processing, for both expression and identity information. Implications are (a) normal expression recognition in CP can derive from compensatory strategies (e.g., over-reliance on non-holistic cues to expression); (b) the split between processing of expression and identity information may take place after a common stage of holistic processing; and (c) contrary to a recent claim, holistic processing of identity is functionally involved in face identification ability. Copyright © 2011 Elsevier Ltd. All rights reserved.
[Regeneration and repair of peripheral nerves: clinical implications in facial paralysis surgery].
Hontanilla, B; Vidal, A
2000-01-01
Peripheral nerve lesions are one of the most frequent causes of chronic incapacity. Upper or lower limb palsies due to brachial or lumbar plexus injuries, facial paralysis, and nerve lesions caused by systemic diseases are among the major concerns of plastic and reconstructive surgery. However, the poor results obtained in peripheral nerve repairs during the Second World War led to a pessimistic view of peripheral nerve repair. Nevertheless, a better understanding of microsurgical principles of reconstruction and of the molecular biology of nerve regeneration has improved clinical results. Thus, although the results obtained are still far from perfect, these procedures give patients hope of recovering from their lesions and thereby regaining function. The technical aspects of nerve repair are well established; the next step is to manipulate the biology. In this article we discuss the biological processes involved in peripheral nerve regeneration, establish the main concepts of peripheral nerve repair as applied to facial paralysis, and, finally, offer some ideas about how clinical practice could be affected by manipulation of peripheral nerve biology.
Dawel, Amy; O'Kearney, Richard; McKone, Elinor; Palermo, Romina
2012-11-01
The present meta-analysis aimed to clarify whether deficits in emotion recognition in psychopathy are restricted to certain emotions and modalities or whether they are more pervasive. We also attempted to assess the influence of other important variables: age, and the affective factor of psychopathy. A systematic search of electronic databases and a subsequent manual search identified 26 studies that included 29 experiments (N = 1376) involving six emotion categories (anger, disgust, fear, happiness, sadness, surprise) across three modalities (facial, vocal, postural). Meta-analyses found evidence of pervasive impairments across modalities (facial and vocal) with significant deficits evident for several emotions (i.e., not only fear and sadness) in both adults and children/adolescents. These results are consistent with recent theorizing that the amygdala, which is believed to be dysfunctional in psychopathy, has a broad role in emotion processing. We discuss limitations of the available data that restrict the ability of meta-analysis to consider the influence of age and separate the sub-factors of psychopathy, highlighting important directions for future research. Copyright © 2012 Elsevier Ltd. All rights reserved.
Effects of delta-9-tetrahydrocannabinol on evaluation of emotional images
Ballard, Michael E; Bedi, Gillinder; de Wit, Harriet
2013-01-01
There is growing evidence that drugs of abuse alter processing of emotional information in ways that could be attractive to users. Our recent report that Δ9-tetrahydrocannabinol (THC) diminishes amygdalar activation in response to threat-related faces suggests that THC may modify evaluation of emotionally-salient, particularly negative or threatening, stimuli. In this study, we examined the effects of acute THC on evaluation of emotional images. Healthy volunteers received two doses of THC (7.5 and 15 mg; p.o.) and placebo across separate sessions before performing tasks assessing facial emotion recognition and emotional responses to pictures of emotional scenes. THC significantly impaired recognition of facial fear and anger, but it only marginally impaired recognition of sadness and happiness. The drug did not consistently affect ratings of emotional scenes. THC's effects on emotional evaluation were not clearly related to its mood-altering effects. These results support our previous work, and show that THC reduces perception of facial threat. Nevertheless, THC does not appear to positively bias evaluation of emotional stimuli in general. PMID:22585232
Bowen, Erica; Dixon, Louise
2010-01-01
This study examined the concurrent and prospective associations between children's ability to accurately recognize facial affect at age 8.5 years and antisocial behavior at ages 8.5 and 10.5 years in a subsample of the Avon Longitudinal Study of Parents and Children cohort (5,396 children; 2,644 [49%] males). All observed effects were small. It was found that at age 8.5 years, in contrast to nonantisocial children, antisocial children were less accurate at decoding happy and sad expressions presented at low intensity. In addition, concurrent antisocial behavior was associated with misidentifying expressions of fear as expressions of sadness. In longitudinal analyses, children who misidentified fear as anger exhibited a decreased risk of antisocial behavior 2 years later. The study suggests that concurrent rather than future antisocial behavior is associated with facial affect recognition accuracy. (c) 2010 Wiley-Liss, Inc.
Affective priming using facial expressions modulates liking for abstract art.
Flexas, Albert; Rosselló, Jaume; Christensen, Julia F; Nadal, Marcos; Olivera La Rosa, Antonio; Munar, Enric
2013-01-01
We examined the influence of affective priming on the appreciation of abstract artworks using an evaluative priming task. Facial primes (showing happiness, disgust or no emotion) were presented under brief (Stimulus Onset Asynchrony, SOA = 20 ms) and extended (SOA = 300 ms) conditions. Differences in aesthetic liking for abstract paintings depending on the emotion expressed in the preceding primes provided a measure of the priming effect. The results showed that, for the extended SOA, artworks were liked more when preceded by happiness primes and less when preceded by disgust primes. Facial expressions of happiness, though not of disgust, exerted similar effects in the brief SOA condition. Subjective measures and a forced-choice task revealed no evidence of prime awareness in the suboptimal condition. Our results are congruent with findings showing that the affective transfer elicited by priming biases evaluative judgments, extending previous research to the domain of aesthetic appreciation.
Colasante, Tyler; Mossad, Sarah I; Dudek, Joanna; Haley, David W
2017-04-01
Understanding the relative and joint prioritization of age- and valence-related face characteristics in adults' cortical face processing remains elusive because these two characteristics have not been manipulated in a single study of neural face processing. We used electroencephalography to investigate adults' P1, N170, P2 and LPP responses to infant and adult faces with happy and sad facial expressions. Viewing infant vs adult faces was associated with significantly larger P1, N170, P2 and LPP responses, with hemisphere and/or participant gender moderating this effect in select cases. Sad faces were associated with significantly larger N170 responses than happy faces. Sad infant faces were associated with significantly larger N170 responses in the right hemisphere than all other combinations of face age and face valence characteristics. We discuss the relative and joint neural prioritization of infant face characteristics and negative facial affect, and their biological value as distinct caregiving and social cues. © The Author (2016). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
Bagchi, Gautam; Nath, Dilip Kumar
2012-01-01
Permanent facial paralysis can be devastating for a patient. Modern society's emphasis on appearance and physical beauty contributes to this problem and often leads to isolation of patients embarrassed by their appearance. Lagophthalmos with ocular exposure, loss of oral competence with resultant drooling, alar collapse with nasal airway obstruction, and difficulties with mastication and speech production are all potential consequences of facial paralysis. Affected patients are confronted with both a cosmetic defect and the functional deficits associated with loss of facial nerve function. In this case history report, a modified maxillary complete denture permitted a patient with Bell palsy to carry on daily activities with minimal facial distortion, pain, speech difficulty, and associated emotional trauma.
Bell’s palsy: data from a study of 70 cases
Cirpaciu, D; Goanta, CM
2014-01-01
Bell's palsy is a condition that affects the facial nerve, one of the twelve cranial nerves, whose main function is to control the muscles of facial expression. It is a unilateral, acute, partial or complete paralysis of the facial nerve. Bell's palsy remains the most common cause of facial nerve paralysis; it is more often encountered in females aged 17 to 30 years, is recurrent in many cases, and shows poor associations with other pathologic conditions. In the modern literature, the suspected etiology is reactivation of latent herpes virus infections in the geniculate ganglion and their subsequent migration to the facial nerve; favorable outcomes with vasodilator, neurotrophic, and corticosteroid therapy have nevertheless been recorded. PMID:25870668
Unconscious processing of facial attractiveness: invisible attractive faces orient visual attention
Hung, Shao-Min; Nieh, Chih-Hsuan; Hsieh, Po-Jang
2016-01-01
Past research has proven humans' extraordinary ability to extract information from a face in the blink of an eye, including its emotion, gaze direction, and attractiveness. However, it remains elusive whether facial attractiveness can be processed, and can influence our behavior, in the complete absence of conscious awareness. Here we demonstrate unconscious processing of facial attractiveness with three distinct approaches. In Experiment 1, the time taken for faces to break interocular suppression was measured. The results showed that attractive faces enjoyed the privilege of breaking suppression and reaching consciousness earlier. In Experiment 2, we further showed that attractive faces had lower visibility thresholds, again suggesting that facial attractiveness could be processed more easily to reach consciousness. Crucially, in Experiment 3, a significant decrease of accuracy on an orientation discrimination task subsequent to an invisible attractive face showed that attractive faces, albeit suppressed and invisible, still exerted an effect by orienting attention. Taken together, for the first time, we show that facial attractiveness can be processed in the complete absence of consciousness, and an unconscious attractive face is still capable of directing our attention. PMID:27848992
Mapping correspondence between facial mimicry and emotion recognition in healthy subjects.
Ponari, Marta; Conson, Massimiliano; D'Amico, Nunzia Pina; Grossi, Dario; Trojano, Luigi
2012-12-01
We aimed at verifying the hypothesis that facial mimicry is causally and selectively involved in emotion recognition. For this purpose, in Experiment 1, we explored the effect of tonic contraction of muscles in upper or lower half of participants' face on their ability to recognize emotional facial expressions. We found that the "lower" manipulation specifically impaired recognition of happiness and disgust, the "upper" manipulation impaired recognition of anger, while both manipulations affected recognition of fear; recognition of surprise and sadness were not affected by either blocking manipulations. In Experiment 2, we verified whether emotion recognition is hampered by stimuli in which an upper or lower half-face showing an emotional expression is combined with a neutral half-face. We found that the neutral lower half-face interfered with recognition of happiness and disgust, whereas the neutral upper half impaired recognition of anger; recognition of fear and sadness was impaired by both manipulations, whereas recognition of surprise was not affected by either manipulation. Taken together, the present findings support simulation models of emotion recognition and provide insight into the role of mimicry in comprehension of others' emotional facial expressions. PsycINFO Database Record (c) 2012 APA, all rights reserved.
Physical therapy for facial paralysis: a tailored treatment approach.
Brach, J S; VanSwearingen, J M
1999-04-01
Bell palsy is an acute facial paralysis of unknown etiology. Although recovery from Bell palsy is expected without intervention, clinical experience suggests that recovery is often incomplete. This case report describes a classification system used to guide treatment and to monitor recovery of an individual with facial paralysis. The patient was a 71-year-old woman with complete left facial paralysis secondary to Bell palsy. Signs and symptoms were assessed using a standardized measure of facial impairment (Facial Grading System [FGS]) and questions regarding functional limitations. A treatment-based category was assigned based on signs and symptoms. Rehabilitation involved muscle re-education exercises tailored to the treatment-based category. In 14 physical therapy sessions over 13 months, the patient had improved facial impairments (initial FGS score= 17/100, final FGS score= 68/100) and no reported functional limitations. Recovery from Bell palsy can be a complicated and lengthy process. The use of a classification system may help simplify the rehabilitation process.
The effects of social anxiety on emotional face discrimination and its modulation by mouth salience.
du Rocher, Andrew R; Pickering, Alan D
2018-05-21
People high in social anxiety experience fear of social situations due to the likelihood of social evaluation. Whereas happy faces are generally processed very quickly, this effect is impaired by high social anxiety. Mouth regions are implicated during emotional face processing, therefore differences in mouth salience might affect how social anxiety relates to emotional face discrimination. We designed an emotional facial expression recognition task to reveal how varying levels of sub-clinical social anxiety (measured by questionnaire) related to the discrimination of happy and fearful faces, and of happy and angry faces. We also categorised the facial expressions by the salience of the mouth region (i.e. high [open mouth] vs. low [closed mouth]). In a sample of 90 participants higher social anxiety (relative to lower social anxiety) was associated with a reduced happy face reaction time advantage. However, this effect was mainly driven by the faces with less salient closed mouths. Our results are consistent with theories of anxiety that incorporate an oversensitive valence evaluation system.
Computer-aided psychotherapy based on multimodal elicitation, estimation and regulation of emotion.
Cosić, Krešimir; Popović, Siniša; Horvat, Marko; Kukolja, Davor; Dropuljić, Branimir; Kovač, Bernard; Jakovljević, Miro
2013-09-01
Contemporary psychiatry is looking to the affective sciences to understand human behavior, cognition and the mind in health and disease. Since it has been recognized that emotions play a pivotal role in the human mind, an ever increasing number of laboratories and research centers are interested in the affective sciences, affective neuroscience, affective psychology and affective psychopathology. This paper therefore presents multidisciplinary research results on stress resilience from the Laboratory for Interactive Simulation System at the Faculty of Electrical Engineering and Computing, University of Zagreb. A patient's distortion in emotional processing of multimodal input stimuli is predominantly a consequence of the cognitive deficit produced by his or her individual mental health disorder. These emotional distortions in the patient's multimodal physiological, facial, acoustic, and linguistic features in response to the presented stimulation can be used as an indicator of mental illness. Real-time processing and analysis of the patient's multimodal responses to annotated input stimuli is based on appropriate machine learning methods from computer science. Comprehensive longitudinal multimodal analysis of the patient's emotion, mood, feelings, attention, motivation, decision-making, and working memory, synchronized with the multimodal stimuli, provides an extremely valuable database for data mining, machine learning and machine reasoning. The presented multimedia stimulus sequence includes personalized images, movies and sounds, as well as semantically congruent narratives. Simultaneously with stimulus presentation, the patient provides subjective emotional ratings of the presented stimuli in terms of subjective units of discomfort/distress, discrete emotions, or valence and arousal. These subjective emotional ratings of the input stimuli and the corresponding physiological, speech, and facial output features provide enough information to evaluate the patient's cognitive appraisal deficit.
Aggregated real-time visualization of this information provides valuable assistance in diagnosing the patient's mental state, giving the therapist deeper and broader insight into the dynamics and progress of the psychotherapy.
Kamboj, Sunjeev K; Joye, Alyssa; Bisby, James A; Das, Ravi K; Platt, Bradley; Curran, H Valerie
2013-05-01
Studies of affect recognition can inform our understanding of the interpersonal effects of alcohol and help develop a more complete neuropsychological profile of this drug. The objective of the study was to examine affect recognition in social drinkers using a novel dynamic affect-recognition task, sampling performance across a range of evolutionarily significant target emotions and neutral expressions. Participants received 0, 0.4 or 0.8 g/kg alcohol in a double-blind, independent-groups design. Relatively naturalistic changes in facial expression, from neutral (mouth open) to increasing intensities of target emotions, as well as neutral (mouth closed), were simulated using computer-generated dynamic morphs. Accuracy and reaction time were measured, and a two-high-threshold model was applied to hit and false-alarm data to determine sensitivity and response bias. While there was no effect on the principal emotion expressions (happiness, sadness, fear, anger and disgust), participants administered 0.4 g/kg alcohol tended to show an enhanced response bias toward neutral expressions compared with those receiving 0.8 g/kg alcohol or placebo. Exploration of this effect suggested an accompanying tendency to misattribute neutrality to sad expressions following the 0.4 g/kg dose. Thus 0.4 g/kg alcohol, but not 0.8 g/kg, produced a limited and specific modification of affect recognition, evidenced by a neutral response bias and possibly an accompanying tendency to misclassify sad expressions as neutral. In light of previous findings on involuntary negative memory following the 0.4 g/kg dose, we suggest that moderate, but not high, doses of alcohol have a special relevance to emotional processing in social drinkers.
Long-term academic stress enhances early processing of facial expressions.
Zhang, Liang; Qin, Shaozheng; Yao, Zhuxi; Zhang, Kan; Wu, Jianhui
2016-11-01
Exposure to long-term stress can lead to a variety of emotional and behavioral problems. Although widely investigated, the neural basis of how long-term stress impacts emotional processing in humans remains largely elusive. Using event-related brain potentials (ERPs), we investigated the effects of long-term stress on the neural dynamics of emotional facial expression processing. Thirty-nine male college students undergoing preparation for a major examination and twenty-one matched controls performed a gender discrimination task for faces displaying angry, happy, and neutral expressions. The results of the Perceived Stress Scale showed that participants in the stress group perceived higher levels of long-term stress relative to the control group. ERP analyses revealed differential effects of long-term stress on two early stages of facial expression processing: 1) long-term stress generally augmented posterior P1 amplitudes to facial stimuli irrespective of expression valence, suggesting that stress can increase sensitization to visual inputs in general, and 2) long-term stress selectively augmented fronto-central P2 amplitudes for angry but not for neutral or positive facial expressions, suggesting that stress may lead to increased attentional prioritization to processing negative emotional stimuli. Together, our findings suggest that long-term stress has profound impacts on the early stages of facial expression processing, with an increase at the very early stage of general information inputs and a subsequent attentional bias toward processing emotionally negative stimuli. Copyright © 2016 Elsevier B.V. All rights reserved.
Henderson, Heather A.; Newell, Lisa; Jaime, Mark; Mundy, Peter
2015-01-01
Higher-functioning participants with and without autism spectrum disorder (ASD) viewed a series of face stimuli, made decisions regarding the affect of each face, and indicated their confidence in each decision. Confidence significantly predicted accuracy across all participants, but this relation was stronger for participants with typical development than participants with ASD. In the hierarchical linear modeling analysis, there were no differences in face processing accuracy between participants with and without ASD, but participants with ASD were more confident in their decisions. These results suggest that individuals with ASD have metacognitive impairments and are overconfident in face processing. Additionally, greater metacognitive awareness was predictive of better face processing accuracy, suggesting that metacognition may be a pivotal skill to teach in interventions. PMID:26496991
The Influence of Facial Signals on the Automatic Imitation of Hand Actions
Butler, Emily E.; Ward, Robert; Ramsey, Richard
2016-01-01
Imitation and facial signals are fundamental social cues that guide interactions with others, but little is known regarding the relationship between these behaviors. It is clear that during expression detection, we imitate observed expressions by engaging similar facial muscles. It is proposed that a cognitive system, which matches observed and performed actions, controls imitation and contributes to emotion understanding. However, there is little known regarding the consequences of recognizing affective states for other forms of imitation, which are not inherently tied to the observed emotion. The current study investigated the hypothesis that facial cue valence would modulate automatic imitation of hand actions. To test this hypothesis, we paired different types of facial cue with an automatic imitation task. Experiments 1 and 2 demonstrated that a smile prompted greater automatic imitation than angry and neutral expressions. Additionally, a meta-analysis of this and previous studies suggests that both happy and angry expressions increase imitation compared to neutral expressions. By contrast, Experiments 3 and 4 demonstrated that invariant facial cues, which signal trait-levels of agreeableness, had no impact on imitation. Despite readily identifying trait-based facial signals, levels of agreeableness did not differentially modulate automatic imitation. Further, a Bayesian analysis showed that the null effect was between 2 and 5 times more likely than the experimental effect. Therefore, we show that imitation systems are more sensitive to prosocial facial signals that indicate “in the moment” states than enduring traits. These data support the view that a smile primes multiple forms of imitation including the copying actions that are not inherently affective. The influence of expression detection on wider forms of imitation may contribute to facilitating interactions between individuals, such as building rapport and affiliation. PMID:27833573
Unconscious Processing of Facial Expressions in Individuals with Internet Gaming Disorder.
Peng, Xiaozhe; Cui, Fang; Wang, Ting; Jiao, Can
2017-01-01
Internet Gaming Disorder (IGD) is characterized by impairments in social communication and the avoidance of social contact. Facial expression processing is the basis of social communication. However, few studies have investigated how individuals with IGD process facial expressions, and whether they have deficits in emotional facial processing remains unclear. The aim of the present study was to explore these two issues by investigating the time course of emotional facial processing in individuals with IGD. A backward masking task was used to investigate the differences between individuals with IGD and normal controls (NC) in the processing of subliminally presented facial expressions (sad, happy, and neutral) with event-related potentials (ERPs). The behavioral results showed that individuals with IGD are slower than NC in response to both sad and neutral expressions in the sad-neutral context. The ERP results showed that individuals with IGD exhibit decreased amplitudes in ERP component N170 (an index of early face processing) in response to neutral expressions compared to happy expressions in the happy-neutral expressions context, which might be due to their expectancies for positive emotional content. The NC, on the other hand, exhibited comparable N170 amplitudes in response to both happy and neutral expressions in the happy-neutral expressions context, as well as sad and neutral expressions in the sad-neutral expressions context. Both individuals with IGD and NC showed comparable ERP amplitudes during the processing of sad expressions and neutral expressions. The present study revealed that individuals with IGD have different unconscious neutral facial processing patterns compared with normal individuals and suggested that individuals with IGD may expect more positive emotion in the happy-neutral expressions context. • The present study investigated whether the unconscious processing of facial expressions is influenced by excessive online gaming.
A validated backward masking paradigm was used to investigate whether individuals with Internet Gaming Disorder (IGD) and normal controls (NC) exhibit different patterns in facial expression processing. • The results demonstrated that individuals with IGD respond differently to facial expressions compared with NC at a preattentive level. Behaviorally, individuals with IGD are slower than NC in response to both sad and neutral expressions in the sad-neutral context. The ERP results further showed (1) decreased amplitudes in the N170 component (an index of early face processing) in individuals with IGD when they process neutral expressions compared with happy expressions in the happy-neutral expressions context, whereas the NC exhibited comparable N170 amplitudes in response to these two expressions; (2) both the IGD and NC groups demonstrated similar N170 amplitudes in response to sad and neutral faces in the sad-neutral expressions context. • The decreased N170 amplitudes to neutral faces relative to happy faces in individuals with IGD might be due to lower expectancies for neutral content in the happy-neutral expressions context, whereas individuals with IGD may hold similar expectancies for neutral and sad faces in the sad-neutral expressions context.
Spontaneous and posed facial expression in Parkinson's disease.
Smith, M C; Smith, M K; Ellgring, H
1996-09-01
Spontaneous and posed emotional facial expressions in individuals with Parkinson's disease (PD, n = 12) were compared with those of healthy age-matched controls (n = 12). The intensity and amount of facial expression in PD patients were expected to be reduced for spontaneous but not posed expressions. Emotional stimuli were video clips selected from films, 2-5 min in duration, designed to elicit feelings of happiness, sadness, fear, disgust, or anger. Facial movements were coded using Ekman and Friesen's (1978) Facial Action Coding System (FACS). In addition, participants rated their emotional experience on 9-point Likert scales. The PD group showed significantly less overall facial reactivity than did controls when viewing the films. The predicted Group X Condition (spontaneous vs. posed) interaction effect on smile intensity was found when PD participants with more severe disease were compared with those with milder disease and with controls. In contrast, ratings of emotional experience were similar for both groups. Depression was positively associated with emotion rating but not with measures of facial activity. Spontaneous facial expression appears to be selectively affected in PD, whereas posed expression and emotional experience remain relatively intact.
The reconstruction of male hair-bearing facial regions.
Ridgway, Emily B; Pribaz, Julian J
2011-01-01
Loss of hair-bearing regions of the face caused by trauma, tumor resection, or burn presents a difficult reconstructive task for plastic surgeons. The ideal tissue substitute should have the same characteristics as the facial area affected, consisting of thin, pliable tissue with a similar color match and hair-bearing quality. This is a retrospective study of 34 male patients who underwent reconstruction of hair-bearing facial regions performed by the senior author (J.J.P.). Local and pedicled flaps were used primarily to reconstruct defects after tumor extirpation, trauma, infections, and burns. Two patients had irradiation before reconstruction. Two patients had prior facial reconstruction with free flaps. The authors found that certain techniques of reconstructing defects in hair-bearing facial regions were more successful than others in particular facial regions and in different sizes of defects. The authors were able to develop a simple algorithm for management of facial defects involving the hair-bearing regions of the eyebrow, sideburn, beard, and mustache that may prospectively aid the planning of reconstructive strategy in these cases.
Pediatric maxillofacial fractures.
Spring, P M; Cote, D N
1996-05-01
Maxillofacial trauma in the pediatric population is a relatively infrequent occurrence. Studies have demonstrated consistently that 5% of all facial fractures occur in children. The low percentage of facial fractures in this age group has been attributed, in part, to the lack of full pneumatization of the sinuses until later in childhood. Review of the literature indicates that boys are more commonly affected than girls and that the majority of pediatric facial fractures occur in children between 6 and 12 years of age. Motor vehicle accidents, falls, and blunt trauma are responsible for the largest number of pediatric facial fractures. The most common sites of facial fracture are the nose and dentoalveolar complex, followed by the mandible, orbit, and midface in most pediatric cohorts. Management of the mandible is often conservative owing to the high percentage of isolated condylar fractures in children. Open reduction and internal fixation of pediatric facial fractures is indicated in complex mandible, midface, and orbital fractures. The effect of rigid fixation on facial skeletal growth is not completely understood.
The Face of the Chameleon: The Experience of Facial Mimicry for the Mimicker and the Mimickee
Kulesza, Wojciech Marek; Cisłak, Aleksandra; Vallacher, Robin R.; Nowak, Andrzej; Czekiel, Martyna; Bedynska, Sylwia
2015-01-01
This research addressed three questions concerning facial mimicry: (a) Does the relationship between mimicry and liking characterize all facial expressions, or is it limited to specific expressions? (b) Is the relationship between facial mimicry and liking symmetrical for the mimicker and the mimickee? (c) Does conscious mimicry have consequences for emotion recognition? A paradigm is introduced in which participants interact over a computer setup with a confederate whose prerecorded facial displays of emotion are synchronized with participants’ behavior to create the illusion of social interaction. In Experiment 1, the confederate did or did not mimic participants’ facial displays of various subsets of basic emotions. Mimicry promoted greater liking for the confederate regardless of which emotions were mimicked. Experiment 2 reversed these roles: participants were instructed to mimic or not to mimic the confederate’s facial displays. Mimicry did not affect liking for the confederate but it did impair emotion recognition. PMID:25811746
Involvement of Right STS in Audio-Visual Integration for Affective Speech Demonstrated Using MEG
Hagan, Cindy C.; Woods, Will; Johnson, Sam; Green, Gary G. R.; Young, Andrew W.
2013-01-01
Speech and emotion perception are dynamic processes in which it may be optimal to integrate synchronous signals emitted from different sources. Studies of audio-visual (AV) perception of neutrally expressed speech demonstrate supra-additive (i.e., where AV>[unimodal auditory+unimodal visual]) responses in left STS to crossmodal speech stimuli. However, emotions are often conveyed simultaneously with speech; through the voice in the form of speech prosody and through the face in the form of facial expression. Previous studies of AV nonverbal emotion integration showed a role for right (rather than left) STS. The current study therefore examined whether the integration of facial and prosodic signals of emotional speech is associated with supra-additive responses in left (cf. results for speech integration) or right (due to emotional content) STS. As emotional displays are sometimes difficult to interpret, we also examined whether supra-additive responses were affected by emotional incongruence (i.e., ambiguity). Using magnetoencephalography, we continuously recorded eighteen participants as they viewed and heard AV congruent emotional and AV incongruent emotional speech stimuli. Significant supra-additive responses were observed in right STS within the first 250 ms for emotionally incongruent and emotionally congruent AV speech stimuli, which further underscores the role of right STS in processing crossmodal emotive signals. PMID:23950977
Olsson, Andreas; Kopsida, Eleni; Sorjonen, Kimmo; Savic, Ivanka
2016-06-01
The abilities to "read" other people's intentions and emotions, and to learn from their experiences, are critical to survival. Previous studies have highlighted the role of sex hormones, notably testosterone and estrogen, in these processes. Yet it is unclear how acute administration of these hormones affects social cognition and emotion. In the present double-blind placebo-controlled study, we administered an acute exogenous dose of testosterone or estrogen to healthy female and male volunteers, respectively, with the aim of investigating the effects of these steroids on social-cognitive and emotional processes. Following hormonal and placebo treatment, participants made (a) facial dominance judgments, (b) mental state inferences (Reading the Mind in the Eyes Test), and (c) learned aversive associations through watching others' emotional responses (observational fear learning [OFL]). Our results showed that testosterone administration to females enhanced ratings of facial dominance but diminished their accuracy in inferring mental states. In men, estrogen administration resulted in an increase in emotional (vicarious) reactivity when watching a distressed other during the OFL task. Taken together, these results suggest that sex hormones affect social-cognitive and emotional functions at several levels, linking our results to neuropsychiatric disorders in which these functions are impaired. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Selective attention modulates high-frequency activity in the face-processing network.
Müsch, Kathrin; Hamamé, Carlos M; Perrone-Bertolotti, Marcela; Minotti, Lorella; Kahane, Philippe; Engel, Andreas K; Lachaux, Jean-Philippe; Schneider, Till R
2014-11-01
Face processing depends on the orchestrated activity of a large-scale neuronal network. Its activity can be modulated by attention as a function of task demands. However, it remains largely unknown whether voluntary, endogenous attention and reflexive, exogenous attention to facial expressions equally affect all regions of the face-processing network, and whether such effects primarily modify the strength of the neuronal response, the latency, the duration, or the spectral characteristics. We exploited the good temporal and spatial resolution of intracranial electroencephalography (iEEG) and recorded from depth electrodes to uncover the fast dynamics of emotional face processing. We investigated frequency-specific responses and event-related potentials (ERP) in the ventral occipito-temporal cortex (VOTC), ventral temporal cortex (VTC), anterior insula, orbitofrontal cortex (OFC), and amygdala when facial expressions were task-relevant or task-irrelevant. All investigated regions of interest (ROI) were clearly modulated by task demands and exhibited stronger changes in stimulus-induced gamma band activity (50-150 Hz) when facial expressions were task-relevant. Observed latencies demonstrate that the activation is temporally coordinated across the network, rather than serially proceeding along a processing hierarchy. Early and sustained responses to task-relevant faces in VOTC and VTC corroborate their role for the core system of face processing, but they also occurred in the anterior insula. Strong attentional modulation in the OFC and amygdala (300 msec) suggests that the extended system of the face-processing network is only recruited if the task demands active face processing. Contrary to our expectation, we rarely observed differences between fearful and neutral faces. Our results demonstrate that activity in the face-processing network is susceptible to the deployment of selective attention. 
Moreover, we show that endogenous attention operates along the whole face-processing network, and that these effects are reflected in frequency-specific changes in the gamma band. Copyright © 2014 Elsevier Ltd. All rights reserved.
Lewis, Michael B
2017-09-01
Consistent with theories from evolutionary psychology, facial symmetry correlates with attractiveness. Further, the preference for symmetrical faces appears to be affected by fertility in women. One limitation of previous research is that faces are often symmetrically lit front-views and so symmetry can be assessed using 2D pictorial information. Another limitation is that two-alternative-forced-choice (2afc) tasks are often used to assess symmetry preference and these cannot distinguish between differences in preference for symmetry and differences in ability of asymmetry detection. The current study used three tasks to assess the effects of facial symmetry: attractiveness ratings, 2afc preference and asymmetry detection. To break the link between 2D pictorial symmetry and facial symmetry, 3D computer generated heads were used with asymmetrical lighting and yaw rotation. Facial symmetry correlated with attractiveness even under more naturalistic viewing conditions. Path analysis indicates that the link between fertility and 2afc symmetry preference is mediated by asymmetry detection not increased preference for symmetry. The existing literature on symmetry preference and attractiveness is reinterpreted in terms of differences in asymmetry detection. Copyright © 2017 Elsevier B.V. All rights reserved.
What is expected from a facial trauma caused by violence?
Goulart, Douglas Rangel; Colombo, Lucas do Amaral; de Moraes, Márcio; Asprino, Luciana
2014-01-01
The aim of this retrospective study was to compare the peculiarities of maxillofacial injuries caused by interpersonal violence with those of other etiologic factors. Medical records of 3,724 patients with maxillofacial injuries in São Paulo state (Brazil) were retrospectively analyzed. The data were submitted to statistical analysis (simple descriptive statistics and the Chi-squared test) using SPSS 18.0 software. Data of 612 patients with facial injuries caused by violence were analyzed. The majority of the patients were male (81%; n = 496), with a mean age of 31.28 years (standard deviation of 13.33 years). These patients were more affected by mandibular and nose fractures when compared with all other patients (P < 0.01), although fewer injuries were recorded in other body parts (χ(2) = 17.54; P < 0.01). Victims of interpersonal violence exhibited more injuries when the neurocranium was analyzed in isolation (χ(2) = 6.85; P < 0.01). Facial trauma due to interpersonal violence seems to be related to a higher rate of facial fractures and lacerations when compared to all patients with facial injuries. Prominent areas of the face and the neurocranium were more affected by injuries.
Dalkıran, Mihriban; Tasdemir, Akif; Salihoglu, Tamer; Emul, Murat; Duran, Alaattin; Ugur, Mufit; Yavuz, Ruhi
2017-09-01
People with schizophrenia have impairments in emotion recognition along with other social cognitive deficits. In the current study, we aimed to investigate the immediate benefits of ECT on facial emotion recognition ability. Thirty-two treatment-resistant patients with schizophrenia who had been indicated for ECT enrolled in the study. Facial emotion stimuli were a set of 56 photographs that depicted seven basic emotions: sadness, anger, happiness, disgust, surprise, fear, and neutral faces. The average age of the participants was 33.4 ± 10.5 years. The rate of recognizing the disgusted facial expression increased significantly after ECT (p < 0.05), and no significant changes were found for the rest of the facial expressions (p > 0.05). After ECT, response times to fearful and happy facial expressions were significantly shorter (p < 0.05). Facial emotion recognition is an important social cognitive skill for social harmony, appropriate relationships, and independent living. ECT sessions do not appear to affect facial emotion recognition ability negatively and seem to improve identification of the disgusted facial expression, which is related to dopamine-rich regions of the brain.
Mäkitie, A A; Salmi, M; Lindford, A; Tuomi, J; Lassus, P
2016-12-01
Prosthetic mask restoration of the donor face is essential in current facial transplant protocols. The aim was to develop a new three-dimensional (3D) printing (additive manufacturing; AM) process for the production of a donor face mask that fulfilled the requirements for facial restoration after facial harvest. A digital image of a single test person's face was obtained in a standardized setting and subjected to three different image processing techniques. These data were used for the 3D modeling and printing of a donor face mask. The process was also tested in a cadaver setting and ultimately used clinically in a donor patient after facial allograft harvest. All three developed and tested techniques enabled the timely 3D printing of a custom-made face mask that is almost an exact replica of the donor patient's face. This technique was successfully used in a facial allotransplantation donor patient. Copyright © 2016 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.
Contrasting Specializations for Facial Motion Within the Macaque Face-Processing System
Fisher, Clark; Freiwald, Winrich A.
2014-01-01
Facial motion transmits rich and ethologically vital information [1, 2], but how the brain interprets this complex signal is poorly understood. Facial form is analyzed by anatomically distinct face patches in the macaque brain [3, 4], and facial motion activates these patches and surrounding areas [5, 6]. Yet it is not known whether facial motion is processed by its own distinct and specialized neural machinery, and if so, what that machinery’s organization might be. To address these questions, we used functional magnetic resonance imaging (fMRI) to monitor the brain activity of macaque monkeys while they viewed low- and high-level motion and form stimuli. We found that, beyond classical motion areas and the known face patch system, moving faces recruited a heretofore-unrecognized face patch. Although all face patches displayed distinctive selectivity for face motion over object motion, only two face patches preferred naturally moving faces, while three others preferred randomized, rapidly varying sequences of facial form. This functional divide was anatomically specific, segregating dorsal from ventral face patches, thereby revealing a new organizational principle of the macaque face-processing system. PMID:25578903
Martín-Ruiz, María-Luisa; Máximo-Bocanegra, Nuria; Luna-Oliva, Laura
2016-03-26
The importance of an early rehabilitation process in children with cerebral palsy (CP) is widely recognized. On the one hand, new and useful treatment tools such as rehabilitation systems based on interactive technologies have appeared for the rehabilitation of gross motor movements. On the other hand, from the therapeutic point of view, performing rehabilitation exercises with the facial muscles can improve the swallowing process, facial expression through the management of the muscles of the face, and even the speech of children with cerebral palsy. However, it is difficult to find interactive games to improve the detection and evaluation of oral-facial musculature dysfunctions in children with CP. This paper describes a framework based on strategies developed for interactive serious games that is created both for typically developing children and children with disabilities. Four interactive games are the core of a virtual environment called SONRIE. This paper demonstrates the benefits of SONRIE for monitoring children's oral-facial difficulties. The next steps will focus on the validation of SONRIE to carry out the rehabilitation process of oral-facial musculature in children with cerebral palsy.
Impaired recognition of happy facial expressions in bipolar disorder.
Lawlor-Savage, Linette; Sponheim, Scott R; Goghari, Vina M
2014-08-01
The ability to accurately judge facial expressions is important in social interactions. Individuals with bipolar disorder have been found to be impaired in emotion recognition; however, the specifics of the impairment are unclear. This study investigated whether facial emotion recognition difficulties in bipolar disorder reflect general cognitive, or emotion-specific, impairments. Impairment in the recognition of particular emotions and the role of processing speed in facial emotion recognition were also investigated. Clinically stable bipolar patients (n = 17) and healthy controls (n = 50) judged five facial expressions in two presentation types, time-limited and self-paced. An age recognition condition was used as an experimental control. Bipolar patients' overall facial recognition ability was unimpaired. However, patients' specific ability to judge happy expressions under time constraints was impaired. Findings suggest a deficit in happy emotion recognition impacted by processing speed. Given the limited sample size, further investigation with a larger patient sample is warranted.
Genetics, the facial plastic and reconstructive surgeon, and the future.
Seidman, M D
2001-01-01
Predicting the future is a daunting task that is typically reserved for visionaries or tarot card readers. Nonetheless, the challenge is set, and this brief essay will predict how genetics and molecular biology may affect diseases in facial plastic and reconstructive surgery.
ERIC Educational Resources Information Center
Journal of College Science Teaching, 2005
2005-01-01
A recent study by Zara Ambadar and Jeffrey F. Cohn of the University of Pittsburgh and Jonathan W. Schooler of the University of British Columbia examined how motion affects people's judgment of subtle facial expressions. Two experiments demonstrated robust effects of motion in facilitating the perception of subtle facial expressions depicting…
Neural and behavioral associations of manipulated determination facial expressions.
Price, Tom F; Hortensius, Ruud; Harmon-Jones, Eddie
2013-09-01
Past research associated relative left frontal cortical activity with positive affect and approach motivation, or the urge to move toward a stimulus. Less work has examined relative left frontal activity and positive emotions ranging from low to high approach motivation, to test whether positive affects that differ in approach motivational intensity influence relative left frontal cortical activity. Participants in the present experiment adopted determination (high approach positive), satisfaction (low approach positive), or neutral facial expressions while electroencephalographic (EEG) activity was recorded. Next, participants completed a task measuring motivational persistence behavior and then they completed self-report emotion questionnaires. Determination compared to satisfaction and neutral facial expressions caused greater relative left frontal activity relative to baseline EEG recordings. Facial expressions did not directly influence task persistence. However, relative left frontal activity correlated positively with persistence on insolvable tasks in the determination condition. These results extend embodiment theories and motivational interpretations of relative left frontal activity. Published by Elsevier B.V.
Do you remember your sad face? The roles of negative cognitive style and sad mood.
Caudek, Corrado; Monni, Alessandra
2013-01-01
We studied the effects of negative cognitive style, sad mood, and facial affect on the self-face advantage in a sample of 66 healthy individuals (mean age 26.5 years, range 19-47 years). The sample was subdivided into four groups according to inferential style and responsivity to sad mood induction. Following a sad mood induction, we examined the effect on working memory of an incidental association between facial affect, facial identity, and head-pose orientation. Overall, head-pose recognition was more accurate for the self-face than for nonself face (self-face advantage, SFA). However, participants high in negative cognitive style who experienced higher levels of sadness displayed a stronger SFA for sad expressions than happy expressions. The remaining participants displayed an opposite bias (a stronger SFA for happy expressions than sad expressions), or no bias. These findings highlight the importance of trait-vulnerability status in the working memory biases related to emotional facial expressions.
Johnson, Kerri L; McKay, Lawrie S; Pollick, Frank E
2011-05-01
Gender stereotypes have been implicated in sex-typed perceptions of facial emotion. Such interpretations were recently called into question because facial cues of emotion are confounded with sexually dimorphic facial cues. Here we examine the role of visual cues and gender stereotypes in perceptions of biological motion displays, thus overcoming the morphological confounding inherent in facial displays. In four studies, participants' judgments revealed gender stereotyping. Observers accurately perceived emotion from biological motion displays (Study 1), and this affected sex categorizations. Angry displays were overwhelmingly judged to be men; sad displays were judged to be women (Studies 2-4). Moreover, this pattern remained strong when stimuli were equated for velocity (Study 3). We argue that these results were obtained because perceivers applied gender stereotypes of emotion to infer sex category (Study 4). Implications for both vision sciences and social psychology are discussed. Copyright © 2011 Elsevier B.V. All rights reserved.
Update on the Ophthalmic Management of Facial Paralysis.
MacIntosh, Peter W; Fay, Aaron M
2018-06-07
Bell palsy is the most common neurologic condition affecting the cranial nerves. Lagophthalmos, exposure keratopathy, and corneal ulceration are potential complications. In this review, we evaluate various causes of facial paralysis as well as the level 1 evidence supporting the use of a short course of oral steroids for idiopathic Bell palsy to improve functional outcomes. Various surgical and nonsurgical techniques are also discussed for the management of residual facial dysfunction. Copyright © 2018. Published by Elsevier Inc.
Shim, Miseon; Kim, Do-Won; Yoon, Sunkyung; Park, Gewnhi; Im, Chang-Hwan; Lee, Seung-Hwan
2016-06-01
A deficit in facial emotion processing is a major characteristic of patients with panic disorder. It is known that visual stimuli with different spatial frequencies take distinct neural pathways. This study investigated facial emotion processing involving stimuli presented at broad, high, and low spatial frequencies in patients with panic disorder. Eighteen patients with panic disorder and 19 healthy controls were recruited. Seven event-related potential (ERP) components (P100, N170, early posterior negativity (EPN), vertex positive potential (VPP), N250, P300, and late positive potential (LPP)) were evaluated while the participants looked at fearful and neutral facial stimuli presented at three spatial frequencies. When a fearful face was presented, panic disorder patients showed a significantly increased P100 amplitude in response to low spatial frequency compared to high spatial frequency, whereas healthy controls demonstrated significant broad-spatial-frequency-dependent processing in P100 amplitude. Vertex positive potential amplitude was significantly increased for high and broad spatial frequencies, compared to low spatial frequency, in panic disorder. Early posterior negativity amplitude was significantly different between high and broad spatial frequency processing, and between low and broad spatial frequency processing, in both groups, regardless of facial expression. The possibly confounding effects of medication could not be controlled. During early visual processing, patients with panic disorder prefer global to detailed information. However, in later processing, panic disorder patients overuse detailed information for the perception of facial expressions. These findings suggest that this unique spatial-frequency-dependent facial processing could shed light on the neural pathology associated with panic disorder. Copyright © 2016 Elsevier B.V. All rights reserved.
Social Use of Facial Expressions in Hylobatids
Scheider, Linda; Waller, Bridget M.; Oña, Leonardo; Burrows, Anne M.; Liebal, Katja
2016-01-01
Non-human primates use various communicative means in interactions with others. While primate gestures are commonly considered to be intentionally and flexibly used signals, facial expressions are often referred to as inflexible, automatic expressions of affective internal states. To explore whether and how non-human primates use facial expressions in specific communicative interactions, we studied five species of small apes (gibbons) by employing a newly established Facial Action Coding System for hylobatid species (GibbonFACS). We found that, despite individuals often being in close proximity to each other, in social (as opposed to non-social) contexts the duration of facial expressions was significantly longer when gibbons were facing another individual than in non-facing situations. Social contexts included grooming, agonistic interactions and play, whereas non-social contexts included resting and self-grooming. Additionally, gibbons used facial expressions while facing another individual more often in social contexts than in non-social contexts, where facial expressions were produced regardless of the attentional state of the partner. Also, facial expressions were more likely ‘responded to’ by the partner’s facial expressions when individuals were facing each other than when they were not. Taken together, our results indicate that gibbons use their facial expressions differentially depending on the social context and are able to use them in a directed way in communicative interactions with other conspecifics. PMID:26978660
When Early Experiences Build a Wall to Others’ Emotions: An Electrophysiological and Autonomic Study
Ardizzi, Martina; Martini, Francesca; Umiltà, Maria Alessandra; Sestito, Mariateresa; Ravera, Roberto; Gallese, Vittorio
2013-01-01
Facial expression of emotions is a powerful vehicle for communicating information about others’ emotional states and it normally induces facial mimicry in observers. The aim of this study was to investigate whether early aversive experiences could interfere with emotion recognition, facial mimicry, and the autonomic regulation of social behaviors. We conducted a facial emotion recognition task in a group of “street-boys” and in an age-matched control group. We recorded facial electromyography (EMG), a marker of facial mimicry, and respiratory sinus arrhythmia (RSA), an index of the recruitment of the autonomic system that promotes social behaviors and predisposition, in response to the observation of facial expressions of emotions. Results showed an over-attribution of anger, and reduced EMG responses during the observation of both positive and negative expressions, only among the street-boys. Street-boys also showed lower RSA after observation of facial expressions and ineffective RSA suppression during presentation of non-threatening expressions. Our findings suggest that early aversive experiences alter not only emotion recognition but also facial mimicry of emotions. These deficits affect the autonomic regulation of social behaviors, inducing lower social predisposition after the visualization of facial expressions and an ineffective recruitment of defensive behavior in response to non-threatening expressions. PMID:23593374
Automated Video Based Facial Expression Analysis of Neuropsychiatric Disorders
Wang, Peng; Barrett, Frederick; Martin, Elizabeth; Milanova, Marina; Gur, Raquel E.; Gur, Ruben C.; Kohler, Christian; Verma, Ragini
2008-01-01
Deficits in emotional expression are prominent in several neuropsychiatric disorders, including schizophrenia. Available clinical facial expression evaluations provide subjective and qualitative measurements, which are based on static 2D images that do not capture the temporal dynamics and subtleties of expression changes. Therefore, there is a need for automated, objective and quantitative measurements of facial expressions captured using videos. This paper presents a computational framework that creates probabilistic expression profiles for video data and can potentially help to automatically quantify emotional expression differences between patients with neuropsychiatric disorders and healthy controls. Our method automatically detects and tracks facial landmarks in videos, and then extracts geometric features to characterize facial expression changes. To analyze temporal facial expression changes, we employ probabilistic classifiers that analyze facial expressions in individual frames, and then propagate the probabilities throughout the video to capture the temporal characteristics of facial expressions. The applications of our method to healthy controls and case studies of patients with schizophrenia and Asperger’s syndrome demonstrate the capability of the video-based expression analysis method in capturing subtleties of facial expression. Such results can pave the way for a video based method for quantitative analysis of facial expressions in clinical research of disorders that cause affective deficits. PMID:18045693
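The framework above classifies expressions frame by frame and then propagates the probabilities through the video to build a temporal profile. A minimal sketch of that propagation idea, assuming a simple exponential blending of each frame's classifier output with the running belief (the paper's actual propagation scheme may differ; the function name and the `alpha` weight are illustrative):

```python
import numpy as np

def temporal_expression_profile(frame_probs: np.ndarray, alpha: float = 0.8) -> np.ndarray:
    """Smooth per-frame expression probabilities over time.

    frame_probs: (n_frames, n_expressions) array of per-frame classifier
    outputs, each row a probability distribution over expressions.
    alpha: assumed weight on the previous frame's belief.
    """
    smoothed = np.empty_like(frame_probs, dtype=float)
    smoothed[0] = frame_probs[0]
    for t in range(1, len(frame_probs)):
        # Blend the running belief with the current frame's evidence,
        # then renormalize so each row stays a probability distribution.
        smoothed[t] = alpha * smoothed[t - 1] + (1 - alpha) * frame_probs[t]
        smoothed[t] /= smoothed[t].sum()
    return smoothed

# Toy sequence: a noisy middle frame is pulled back toward the temporal context.
probs = np.array([[0.9, 0.1], [0.2, 0.8], [0.85, 0.15]])
profile = temporal_expression_profile(probs)
```

The smoothing damps single-frame classifier noise, which is one reason temporal propagation can capture expression dynamics better than independent per-frame decisions.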
Daini, Roberta; Comparetti, Chiara M.; Ricciardelli, Paola
2014-01-01
Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown if a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand if individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus, process this facial dimension independently from features (which are impaired in CP), and basic emotional expressions. To this end, we carried out a behavioral study in which we compared the performance of 6 CP individuals to that of typical development individuals, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was requested (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether or not the recognized face exactly matched the study face or showed any difference. The results confirmed the CPs' impairment in the configural processing of the invariant aspects of the face, but also showed a spared configural processing of non-emotional facial expression (task 1). Interestingly and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change detection ability (task 2). These new results have theoretical implications for face perception models since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition. PMID:25520643
Dall'Oglio, Federica; Tedeschi, Aurora; Guardabasso, Vincenzo; Micali, Giuseppe
2015-09-01
To evaluate whether nonprescription topical agents may provide positive outcomes in the management of mild-to-moderate facial seborrheic dermatitis by reducing inflammation and scale production, through clinical evaluation and erythema-directed digital photography. Open-label, prospective, non-blinded, intra-patient, controlled, clinical trial (target area). Twenty adult subjects affected by mild-to-moderate facial seborrheic dermatitis were enrolled and instructed to apply the study cream two times daily, initially on a selected target area only for seven days. If the subject developed visible improvement, they were advised to extend the application to the entire affected facial area for 21 additional days. Efficacy was evaluated by measuring the grade of erythema (by clinical examination and by erythema-directed digital photography), desquamation (by clinical examination), and pruritus (by subject-completed visual analog scale). Additionally, at the end of the protocol, a Physician Global Assessment was carried out. Eighteen subjects completed the study, whereas two subjects were lost to follow-up for nonadherence and personal reasons, respectively. Day 7 data from target areas showed a significant reduction in erythema. At the end of the study, a significant improvement was recorded for erythema, desquamation, and pruritus compared to baseline. Physician Global Assessment showed improvement in 89 percent of patients, with a complete response in 56 percent of cases. These preliminary results indicate that the study cream may be a viable nonprescription therapeutic option for patients affected by facial seborrheic dermatitis, capable of producing early and significant improvement. This study also emphasizes the advantages of using an erythema-directed digital photography system for assisting in a simple, more accurate erythema severity grading and therapeutic monitoring in patients affected by seborrheic dermatitis.
Weinreich, André; Funcke, Jakob Maria
2014-01-01
Drawing on recent findings, this study examines whether valence concordant electromyography (EMG) responses can be explained as an unconditional effect of mere stimulus processing or as somatosensory simulation driven by task-dependent processing strategies. While facial EMG over the Corrugator supercilii and the Zygomaticus major was measured, each participant performed two tasks with pictures of album covers. One task was an affective evaluation task and the other was to attribute the album covers to one of five decades. The Embodied Emotion Account predicts that valence concordant EMG is more likely to occur if the task necessitates a somatosensory simulation of the evaluative meaning of stimuli. Results support this prediction with regard to Corrugator supercilii in that valence concordant EMG activity was only present in the affective evaluation task but not in the non-evaluative task. Results for the Zygomaticus major were ambiguous. Our findings are in line with the view that EMG activity is an embodied part of the evaluation process and not a mere physical outcome.
Kluczniok, Dorothea; Hindi Attar, Catherine; Stein, Jenny; Poppinga, Sina; Fydrich, Thomas; Jaite, Charlotte; Kappel, Viola; Brunner, Romuald; Herpertz, Sabine C; Boedeker, Katja; Bermpohl, Felix
2017-01-01
Maternal sensitive behavior depends on recognizing one's own child's affective states. The present study investigated distinct and overlapping neural responses of mothers to sad and happy facial expressions of their own child (in comparison to facial expressions of an unfamiliar child). We used functional MRI to measure dissociable and overlapping activation patterns in 27 healthy mothers in response to happy, neutral and sad facial expressions of their own school-aged child and a gender- and age-matched unfamiliar child. To investigate differential activation to sad compared to happy faces of one's own child, we used interaction contrasts. During the scan, mothers had to indicate the affect of the presented face. After scanning, they were asked to rate the perceived emotional arousal and valence levels for each face using a 7-point Likert scale (adapted SAM version). While viewing their own child's sad faces, mothers showed activation in the amygdala and anterior cingulate cortex, whereas happy facial expressions of their own child elicited activation in the hippocampus. Conjoint activation in response to their own child's happy and sad expressions was found in the insula and the superior temporal gyrus. Maternal brain activations differed depending on the child's affective state. Sad faces of their own child activated areas commonly associated with a threat detection network, whereas happy faces activated reward-related brain areas. Overlapping activation was found in empathy-related networks. These distinct neural activation patterns might facilitate sensitive maternal behavior.
Sadness enhances the experience of pain and affects pain-evoked cortical activities: an MEG study.
Yoshino, Atsuo; Okamoto, Yasumasa; Onoda, Keiichi; Shishida, Kazuhiro; Yoshimura, Shinpei; Kunisato, Yoshihiko; Demoto, Yoshihiko; Okada, Go; Toki, Shigeru; Yamashita, Hidehisa; Yamawaki, Shigeto
2012-07-01
Pain is a multidimensional phenomenon. Previous psychological studies have shown that a person's subjective pain threshold can change when certain emotions are recognized. We examined this association with magnetoencephalography. Magnetic field strength was recorded with a 306-channel neuromagnetometer while 19 healthy subjects (7 female, 12 male; age range = 20-30 years) experienced pain stimuli in different emotional contexts induced by the presentation of sad, happy, or neutral facial stimuli. Subjects also rated their subjective pain intensity. We hypothesized that pain stimuli were affected by sadness induced by facial recognition. We found: 1) the intensity of subjective pain ratings increased in the sad emotional context compared to the happy and the neutral contexts, and 2) event-related desynchronization of lower beta bands in the right hemisphere after pain stimuli was larger in the sad emotional condition than in the happy emotional condition. Previous studies have shown that event-related desynchronization in these bands could be consistently observed over the primary somatosensory cortex. These findings suggest that sadness can modulate neural responses to pain stimuli, and that brain processing of pain stimuli had already been affected, at the level of the primary somatosensory cortex, which is critical for sensory processing of pain. We found that subjective pain ratings and cortical beta rhythms after pain stimuli are influenced by the sad emotional context. These results may contribute to understanding the broader relationship between pain and negative emotion. Copyright © 2012 American Pain Society. Published by Elsevier Inc. All rights reserved.
Automatic three-dimensional quantitative analysis for evaluation of facial movement.
Hontanilla, B; Aubá, C
2008-01-01
The aim of this study is to present a new 3D capture system of facial movements called FACIAL CLIMA. It is an automatic optical motion system that involves placing special reflective dots on the subject's face and video-recording the subject with three infrared-light cameras while performing several facial movements such as smiling, mouth puckering, eye closure and forehead elevation. Images from the cameras are automatically processed with a software program that generates customised information such as 3D data on velocities and areas. The study was performed in 20 healthy volunteers. The accuracy of the measurement process and the intrarater and interrater reliabilities were evaluated. Comparison of a known distance and angle with those obtained by FACIAL CLIMA shows that this system is accurate to within 0.13 mm and 0.41 degrees. In conclusion, the accuracy of the FACIAL CLIMA system for evaluation of facial movements is demonstrated, as well as its high intrarater and interrater reliability. It has advantages over other systems developed for evaluation of facial movements, such as short calibration time, short measuring time and ease of use, and it provides not only distances but also velocities and areas. Thus the FACIAL CLIMA system could be considered an adequate tool to assess the outcome of facial paralysis reanimation surgery, allowing patients with facial paralysis to be compared between surgical centres so that the effectiveness of facial reanimation operations can be evaluated.
The Two Sides of Beauty: Laterality and the Duality of Facial Attractiveness
ERIC Educational Resources Information Center
Franklin, Robert G., Jr.; Adams, Reginald B., Jr.
2010-01-01
We hypothesized that facial attractiveness represents a dual judgment, a combination of reward-based, sexual processes, and aesthetic, cognitive processes. Herein we describe a study that demonstrates that sexual and nonsexual processes both contribute to attractiveness judgments and that these processes can be dissociated. Female participants…
Zaki, Victor
2017-02-01
This is a retrospective study aimed to determine the efficacy of mini-scleral contact lenses in protecting the cornea and improving vision in cases of facial palsy. Patients with facial palsy develop exposure keratitis because the cornea is dry. They feel pain, discomfort and excessive watering. If left untreated, this leads to permanent damage to the cornea and loss of good functional vision. Mini-scleral lenses keep the cornea covered by saline solution during all wearing hours. Three patients (4 eyes) with acoustic neuroma, two unilateral and one bilateral, who underwent acoustic neuroma surgeries resulting in facial palsy, are presented. The gold implant and lateral tarsorrhaphy were not enough for corneal protection. Two patients (patients 1 and 2) suffered continuous pain and watering. They had to apply a thick lubricant, Lacri-Lube ointment (Allergan, Inc., Dublin, Ireland), several times daily to the affected eye for 15 years. The vision of these patients in the affected eyes was counting fingers (CF) at one foot. Patient 3, with bilateral facial palsy, had exposure keratitis in both eyes resulting in constant watering, pain and blurred vision. The 4 eyes were fitted with mini-scleral lenses. The lenses were 15.8 mm rigid gas-permeable lenses filled with preservative-free saline solution that continuously covers the cornea during all wearing hours. In patients 1 and 2 with unilateral facial palsy, vision improved through the mini-scleral lenses to 20/30 and all their symptoms disappeared. The keratitis in case 3 with bilateral facial palsy disappeared within one week of mini-scleral lens use. Follow-up for 2 years showed that these patients maintained good vision with no side effects. Mini-scleral lenses protected the cornea, gave comfort and improved the vision and the quality of life of these three patients with facial palsy, and should be considered for all patients with facial palsy.
Jessen, Sarah; Grossmann, Tobias
2017-01-01
Enhanced attention to fear expressions in adults is primarily driven by information from low as opposed to high spatial frequencies contained in faces. However, little is known about the role of spatial frequency information in emotion processing during infancy. In the present study, we examined the role of low compared to high spatial frequencies in the processing of happy and fearful facial expressions by using filtered face stimuli and measuring event-related brain potentials (ERPs) in 7-month-old infants ( N = 26). Our results revealed that infants' brains discriminated between emotional facial expressions containing high but not between expressions containing low spatial frequencies. Specifically, happy faces containing high spatial frequencies elicited a smaller Nc amplitude than fearful faces containing high spatial frequencies and happy and fearful faces containing low spatial frequencies. Our results demonstrate that already in infancy spatial frequency content influences the processing of facial emotions. Furthermore, we observed that fearful facial expressions elicited a comparable Nc response for high and low spatial frequencies, suggesting a robust detection of fearful faces irrespective of spatial frequency content, whereas the detection of happy facial expressions was contingent upon frequency content. In summary, these data provide new insights into the neural processing of facial emotions in early development by highlighting the differential role played by spatial frequencies in the detection of fear and happiness.
The influence of natural head position on the assessment of facial morphology.
Woźniak, Krzysztof; Piątkowska, Dagmara; Lipski, Mariusz
2012-01-01
Skeletal relationships play a major part in determining occlusal relationships, which is why they also affect orthodontic treatment. Facial morphology can be assessed by clinical or radiological methods. Soft tissue analysis of the face is accepted as an integral part of orthodontic diagnosis and treatment planning. The aim of the study was to determine the impact of the inclination between the Frankfort horizontal (FH) and the extracranial horizontal (HOR) lines with the head in the natural head position (NHP) on the assessment of facial morphology. Lateral facial photographs of 200 young adult males and females were taken with the head in the natural head position and then analyzed. Each image was rotated in order to position the Frankfort line parallel to the extracranial horizontal line. Twelve landmarks on each of the 400 profile photographs (200 original, 200 processed) were identified, and nine linear and three angular measurements were assessed. The inclination angle between the extracranial horizontal line and the Frankfort horizontal line in the NHP varied from -7.1° to 5.6° (mean -1.2°). Significant correlations were found between the inclination angle FH/HOR and both sagittal and vertical morphology predictors, such as the sections N-Sn (r = 0.3737, p = 0.0001) and Sn-Gn (r = 0.3231, p = 0.0000), and both facial angles (r = 0.9774, p = 0.0000) and profile angles (r = 0.9654, p = 0.0000). A comparison of soft tissue measurements determined with reference to the Frankfort horizontal and extracranial horizontal lines with the head in the natural position reveals significant differences.
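The correction the study applies, rotating each photograph until the Frankfort line is parallel to the extracranial horizontal, amounts to a planar rotation of the landmark coordinates. A minimal sketch of that geometry (the function name and parameters are illustrative, not the authors' software):

```python
import math

def rotate_landmarks(points, angle_deg, origin=(0.0, 0.0)):
    """Rotate 2D landmark coordinates about an origin by angle_deg degrees.

    Illustrates leveling a tilted Frankfort horizontal (FH) line:
    rotating by the negative of the measured FH/HOR inclination brings
    the FH line parallel to the extracranial horizontal.
    """
    a = math.radians(angle_deg)
    ox, oy = origin
    rotated = []
    for x, y in points:
        dx, dy = x - ox, y - oy  # coordinates relative to the pivot
        rotated.append((ox + dx * math.cos(a) - dy * math.sin(a),
                        oy + dx * math.sin(a) + dy * math.cos(a)))
    return rotated

# Example: an FH line tilted by -1.2 degrees (the study's mean inclination)
# is brought level by rotating the landmarks by +1.2 degrees.
fh = [(0.0, 0.0), (100.0, math.tan(math.radians(-1.2)) * 100.0)]
leveled = rotate_landmarks(fh, 1.2)
```

After the rotation the second landmark's y-coordinate is (numerically) zero, i.e. the line is horizontal; linear and angular measurements taken afterward are then referenced to the extracranial horizontal rather than the tilted FH line.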
Emotion perception across cultures: the role of cognitive mechanisms
Engelmann, Jan B.; Pogosyan, Marianna
2012-01-01
Despite consistently documented cultural differences in the perception of facial expressions of emotion, the role of culture in shaping cognitive mechanisms that are central to emotion perception has received relatively little attention in past research. We review recent developments in cross-cultural psychology that provide particular insights into the modulatory role of culture on cognitive mechanisms involved in interpretations of facial expressions of emotion through two distinct routes: display rules and cognitive styles. Investigations of emotion intensity perception have demonstrated that facial expressions with varying levels of intensity of positive affect are perceived and categorized differently across cultures. Specifically, recent findings indicating significant levels of differentiation between intensity levels of facial expressions among American participants, as well as deviations from clear categorization of high and low intensity expressions among Japanese and Russian participants, suggest that display rules shape mental representations of emotions, such as intensity levels of emotion prototypes. Furthermore, a series of recent studies using eye tracking as a proxy for overt attention during face perception have identified culture-specific cognitive styles, such as the propensity to attend to very specific features of the face. Together, these results suggest a cascade of cultural influences on cognitive mechanisms involved in interpretations of facial expressions of emotion, whereby cultures impart specific behavioral practices that shape the way individuals process information from the environment. These cultural influences lead to differences in cognitive styles due to culture-specific attentional biases and emotion prototypes, which partially account for the gradient of cultural agreements and disagreements obtained in past investigations of emotion perception. PMID:23486743