Percinel, Ipek; Ozbaran, Burcu; Kose, Sezen; Simsek, Damla Goksen; Darcan, Sukran
2018-03-01
In this study we aimed to evaluate emotion recognition and emotion regulation skills of children with exogenous obesity between the ages of 11 and 18 years and to compare them with healthy controls. The Schedule for Affective Disorders and Schizophrenia for School-Aged Children was used for psychiatric evaluations. Emotion recognition skills were evaluated using the Faces Test and the Reading the Mind in the Eyes Test. The Difficulties in Emotion Regulation Scale was used to evaluate emotion regulation skills. Children with obesity had lower scores on the Faces Test and the Reading the Mind in the Eyes Test, and experienced greater difficulty with emotion regulation. An improved understanding of emotion recognition and emotion regulation in young people with obesity may improve their social adaptation and help in the treatment of their disorder. To the best of our knowledge, this is the first study to evaluate both emotion recognition and emotion regulation functions in obese children and adolescents between 11 and 18 years of age.
Benito, Adolfo; Lahera, Guillermo; Herrera, Sara; Muncharaz, Ramón; Benito, Guillermo; Fernández-Liria, Alberto; Montes, José Manuel
2013-01-01
To analyze the recognition, identification, and discrimination of facial emotions in a sample of outpatients with bipolar disorder (BD). Forty-four outpatients with a diagnosis of BD and 48 matched control subjects were selected. Both groups were assessed with tests for recognition (Emotion Recognition-40 - ER40), identification (Facial Emotion Identification Test - FEIT), and discrimination (Facial Emotion Discrimination Test - FEDT) of facial emotions, as well as a theory of mind (ToM) verbal test (Hinting Task). Differences between groups were analyzed, controlling for the influence of mild depressive and manic symptoms. Patients with BD scored significantly lower than controls on recognition (ER40), identification (FEIT), and discrimination (FEDT) of emotions. Regarding the verbal measure of ToM, a lower score was also observed in patients compared to controls. Patients with mild syndromal depressive symptoms obtained outcomes similar to patients in euthymia. A significant correlation between FEDT scores and global functioning (measured by the Functioning Assessment Short Test, FAST) was found. These results suggest that, even in euthymia, patients with BD experience deficits in recognition, identification, and discrimination of facial emotions, with potential functional implications.
Neves, Maila de Castro Lourenço das; Tremeau, Fabien; Nicolato, Rodrigo; Lauar, Hélio; Romano-Silva, Marco Aurélio; Correa, Humberto
2011-09-01
A large body of evidence suggests that several aspects of face processing are impaired in autism and that this impairment might be hereditary. This study was aimed at assessing facial emotion recognition in parents of children with autism and its associations with a functional polymorphism of the serotonin transporter (5HTTLPR). We evaluated 40 parents of children with autism and 41 healthy controls. All participants were administered the Penn Emotion Recognition Test (ER40) and were genotyped for 5HTTLPR. Our study showed that parents of children with autism performed worse in the facial emotion recognition test than controls. Analyses of error patterns showed that parents of children with autism over-attributed neutral to emotional faces. We found evidence that 5HTTLPR polymorphism did not influence the performance in the Penn Emotion Recognition Test, but that it may determine different error patterns. Facial emotion recognition deficits are more common in first-degree relatives of autistic patients than in the general population, suggesting that facial emotion recognition is a candidate endophenotype for autism.
[Impact of facial emotional recognition alterations in Dementia of the Alzheimer type].
Rubinstein, Wanda; Cossini, Florencia; Politis, Daniel
2016-07-01
Face recognition of basic emotions is independent of other deficits in dementia of the Alzheimer type. Among these deficits, there is disagreement about which emotions are more difficult to recognize. Our aim was to study the presence of alterations in the facial recognition of basic emotions, and to investigate whether there were differences in the recognition of each type of emotion in Alzheimer's disease. Using three tests of recognition of basic facial emotions, we evaluated 29 patients who had been diagnosed with dementia of the Alzheimer type and 18 control subjects. Significant between-group differences were obtained on the tests of recognition of basic facial emotions, as well as between the individual emotions. Since the amygdala, one of the brain structures responsible for emotional reactions, is affected in the early stages of this disease, our findings are relevant to understanding how this alteration of the process of emotional recognition contributes to the difficulties these patients have in both interpersonal relations and behavioral disorders.
Labuschagne, Izelle; Jones, Rebecca; Callaghan, Jenny; Whitehead, Daisy; Dumas, Eve M; Say, Miranda J; Hart, Ellen P; Justo, Damian; Coleman, Allison; Dar Santos, Rachelle C; Frost, Chris; Craufurd, David; Tabrizi, Sarah J; Stout, Julie C
2013-05-15
Facial emotion recognition impairments have been reported in Huntington's disease (HD). However, the nature of the impairments across the spectrum of HD remains unclear. We report on emotion recognition data from 344 participants comprising premanifest HD (PreHD) and early HD patients, and controls. In a test of recognition of facial emotions, we examined responses to six basic emotional expressions and neutral expressions. In addition, and within the early HD sample, we tested for differences on emotion recognition performance between those 'on' vs. 'off' neuroleptic or selective serotonin reuptake inhibitor (SSRI) medications. The PreHD groups showed significant (p<0.05) impaired recognition, compared to controls, on fearful, angry and surprised faces; whereas the early HD groups were significantly impaired across all emotions including neutral expressions. In early HD, neuroleptic use was associated with worse facial emotion recognition, whereas SSRI use was associated with better facial emotion recognition. The findings suggest that emotion recognition impairments exist across the HD spectrum, but are relatively more widespread in manifest HD than in the premanifest period. Commonly prescribed medications to treat HD-related symptoms also appear to affect emotion recognition. These findings have important implications for interpersonal communication and medication usage in HD.
Improving Negative Emotion Recognition in Young Offenders Reduces Subsequent Crime
Hubble, Kelly; Bowen, Katharine L.; Moore, Simon C.; van Goozen, Stephanie H. M.
2015-01-01
Background: Children with antisocial behaviour show deficits in the perception of emotional expressions in others that may contribute to the development and persistence of antisocial and aggressive behaviour. Current treatments for antisocial youngsters are limited in effectiveness. It has been argued that more attention should be devoted to interventions that target neuropsychological correlates of antisocial behaviour. This study examined the effect of emotion recognition training on criminal behaviour. Methods: Emotion recognition and crime levels were studied in 50 juvenile offenders. Whilst all young offenders received their statutory interventions as the study was conducted, a subgroup of twenty-four offenders also took part in a facial affect training aimed at improving emotion recognition. Offenders in the training and control groups were matched for age, SES, IQ and lifetime crime level. All offenders were tested twice for emotion recognition performance, and recent crime data were collected after the testing had been completed. Results: Before the training there were no differences between the groups in emotion recognition, with both groups displaying poor fear, sadness and anger recognition. After the training fear, sadness and anger recognition improved significantly in juvenile offenders in the training group. Although crime rates dropped in all offenders in the 6 months following emotion testing, only the group of offenders who had received the emotion training showed a significant reduction in the severity of the crimes they committed. Conclusions: The study indicates that emotion recognition can be relatively easily improved in youths who engage in serious antisocial and criminal behavior. The results suggest that improved emotion recognition has the potential to reduce the severity of reoffending.
Bologna, Matteo; Berardelli, Isabella; Paparella, Giulia; Marsili, Luca; Ricciardi, Lucia; Fabbrini, Giovanni; Berardelli, Alfredo
2016-01-01
Altered emotional processing, including reduced emotion facial expression and defective emotion recognition, has been reported in patients with Parkinson's disease (PD). However, few studies have objectively investigated facial expression abnormalities in PD using neurophysiological techniques. It is not known whether altered facial expression and recognition in PD are related. To investigate possible deficits in facial emotion expression and emotion recognition and their relationship, if any, in patients with PD. Eighteen patients with PD and 16 healthy controls were enrolled in this study. Facial expressions of emotion were recorded using a 3D optoelectronic system and analyzed using the facial action coding system. Possible deficits in emotion recognition were assessed using the Ekman test. Participants were assessed in one experimental session. Possible relationships between the kinematic variables of facial emotion expression, the Ekman test scores, and clinical and demographic data in patients were evaluated using Spearman's test and multiple regression analysis. The facial expression of all six basic emotions had slower velocity and lower amplitude in patients in comparison to healthy controls (all P < 0.05). Patients also yielded worse Ekman global scores and disgust, sadness, and fear sub-scores than healthy controls (all P < 0.001). Altered facial expression kinematics and emotion recognition deficits were unrelated in patients (all P > 0.05). Finally, no relationship emerged between the kinematic variables of facial emotion expression, the Ekman test scores, and clinical and demographic data in patients (all P > 0.05). The results in this study provide further evidence of altered emotional processing in PD. The lack of any correlation between altered facial emotion expression kinematics and emotion recognition deficits in patients suggests that these abnormalities are mediated by separate pathophysiological mechanisms.
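Spearman's test, used in the analysis above, is a rank correlation: it measures monotonic association between two variables without assuming normality, which suits small clinical samples. A minimal illustrative sketch in Python, on toy data rather than the study's data, assuming the no-ties shortcut formula:

```python
def spearman_rho(x, y):
    """Spearman's rank correlation via the no-ties shortcut formula:
    rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)),
    where d_i is the difference between the ranks of x_i and y_i."""
    assert len(x) == len(y) and len(x) > 1
    n = len(x)

    def ranks(v):
        # Map each value to its 1-based rank (assumes no tied values).
        return {val: i + 1 for i, val in enumerate(sorted(v))}

    rx, ry = ranks(x), ranks(y)
    d2 = sum((rx[a] - ry[b]) ** 2 for a, b in zip(x, y))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Perfectly monotone data give rho = 1; a perfect reversal gives -1.
print(spearman_rho([1, 2, 3, 4], [10, 20, 30, 40]))  # 1.0
print(spearman_rho([1, 2, 3], [3, 2, 1]))            # -1.0
```

In practice a library routine (e.g. one that also handles ties and reports a p-value) would be used; this sketch only shows the arithmetic behind the statistic.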
McIntosh, Lindsey G; Mannava, Sishir; Camalier, Corrie R; Folley, Bradley S; Albritton, Aaron; Konrad, Peter E; Charles, David; Park, Sohee; Neimat, Joseph S
2014-01-01
Parkinson's disease (PD) is traditionally regarded as a neurodegenerative movement disorder; however, nigrostriatal dopaminergic degeneration is also thought to disrupt non-motor loops connecting basal ganglia to areas in frontal cortex involved in cognition and emotion processing. PD patients are impaired on tests of emotion recognition, but it is difficult to disentangle this deficit from the more general cognitive dysfunction that frequently accompanies disease progression. Testing for emotion recognition deficits early in the disease course, prior to cognitive decline, better assesses the sensitivity of these non-motor corticobasal ganglia-thalamocortical loops involved in emotion processing to early degenerative change in basal ganglia circuits. In addition, contrasting this with a group of healthy aging individuals demonstrates changes in emotion processing specific to the degeneration of basal ganglia circuitry in PD. Early PD patients (EPD) were recruited from a randomized clinical trial testing the safety and tolerability of deep brain stimulation (DBS) of the subthalamic nucleus (STN-DBS) in early-staged PD. EPD patients were previously randomized to receive optimal drug therapy only (ODT), or drug therapy plus STN-DBS (ODT + DBS). Matched healthy elderly controls (HEC) and young controls (HYC) also participated in this study. Participants completed two control tasks and three emotion recognition tests that varied in stimulus domain. EPD patients were impaired on all emotion recognition tasks compared to HEC. Neither therapy type (ODT or ODT + DBS) nor therapy state (ON/OFF) altered emotion recognition performance in this study. Finally, HEC were impaired on vocal emotion recognition relative to HYC, suggesting a decline related to healthy aging. This study supports the existence of impaired emotion recognition early in the PD course, implicating an early disruption of fronto-striatal loops mediating emotional function.
Inconsistent emotion recognition deficits across stimulus modalities in Huntington's disease.
Rees, Elin M; Farmer, Ruth; Cole, James H; Henley, Susie M D; Sprengelmeyer, Reiner; Frost, Chris; Scahill, Rachael I; Hobbs, Nicola Z; Tabrizi, Sarah J
2014-11-01
Recognition of negative emotions is impaired in Huntington's disease (HD). It is unclear whether these emotion-specific problems are driven by dissociable cognitive deficits, emotion complexity, test cue difficulty, or visuoperceptual impairments. This study set out to further characterise emotion recognition in HD by comparing patterns of deficits across stimulus modalities; notably including for the first time in HD, the more ecologically and clinically relevant modality of film clips portraying dynamic facial expressions. Fifteen early HD and 17 control participants were tested on emotion recognition from static facial photographs, non-verbal vocal expressions and one second dynamic film clips, all depicting different emotions. Statistically significant evidence of impairment of anger, disgust and fear recognition was seen in HD participants compared with healthy controls across multiple stimulus modalities. The extent of the impairment, as measured by the difference in the number of errors made between HD participants and controls, differed according to the combination of emotion and modality (p=0.013, interaction test). The largest between-group difference was seen in the recognition of anger from film clips. Consistent with previous reports, anger, disgust and fear were the most poorly recognised emotions by the HD group. This impairment did not appear to be due to task demands or expression complexity as the pattern of between-group differences did not correspond to the pattern of errors made by either group; implicating emotion-specific cognitive processing pathology. There was however evidence that the extent of emotion recognition deficits significantly differed between stimulus modalities. The implications in terms of designing future tests of emotion recognition and care giving are discussed.
Nomi, Jason S; Rhodes, Matthew G; Cleary, Anne M
2013-01-01
This study examined how participants' predictions of future memory performance are influenced by emotional facial expressions. Participants made judgements of learning (JOLs) predicting the likelihood that they would correctly identify a face displaying a happy, angry, or neutral emotional expression in a future two-alternative forced-choice recognition test of identity (i.e., recognition that a person's face was seen before). JOLs were higher for studied faces with happy and angry emotional expressions than for neutral faces. However, neutral test faces with studied neutral expressions had significantly higher identity recognition rates than neutral test faces studied with happy or angry expressions. Thus, these data are the first to demonstrate that people believe happy and angry emotional expressions will lead to better identity recognition in the future relative to neutral expressions. This occurred despite the fact that neutral expressions elicited better identity recognition than happy and angry expressions. These findings contribute to the growing literature examining the interaction of cognition and emotion.
Farsham, Aida; Abbaslou, Tahereh; Bidaki, Reza; Bozorg, Bonnie
2017-01-01
Objective: No research has been conducted on facial emotion recognition in patients with borderline personality disorder (BPD) and schizotypal personality disorder (SPD). The present study aimed at comparing facial emotion recognition in these patients with the general population. The neurocognitive processing of emotions can show the pathologic style of these 2 disorders. Method: Twenty BPD patients, 16 SPD patients, and 20 healthy individuals were selected by the available sampling method. The Structured Clinical Interview for Axis II, the Millon Personality Inventory, the Beck Depression Inventory, and a Facial Emotional Recognition Test were administered to all participants. Results: One-way ANOVA and Scheffé's post hoc test revealed significant differences in the neuropsychological assessment of facial emotional recognition between the BPD and SPD patients and the normal group (p = 0.001). A significant difference was found in recognition of fear between the BPD group and the normal population (p = 0.008), and a significant difference was observed between SPD patients and the control group in recognition of wonder (p = 0.04). The obtained results indicated a deficit in negative emotion recognition, especially disgust; thus, it can be concluded that these patients have the same neurocognitive profile in the emotion domain.
Aging and Emotion Recognition: Not Just a Losing Matter
Sze, Jocelyn A.; Goodkind, Madeleine S.; Gyurak, Anett; Levenson, Robert W.
2013-01-01
Past studies on emotion recognition and aging have found evidence of age-related decline when emotion recognition was assessed by having participants detect single emotions depicted in static images of full or partial (e.g., eye region) faces. These tests afford good experimental control but do not capture the dynamic nature of real-world emotion recognition, which is often characterized by continuous emotional judgments and dynamic multi-modal stimuli. Research suggests that older adults often perform better under conditions that better mimic real-world social contexts. We assessed emotion recognition in young, middle-aged, and older adults using two traditional methods (single emotion judgments of static images of faces and eyes) and an additional method in which participants made continuous emotion judgments of dynamic, multi-modal stimuli (videotaped interactions between young, middle-aged, and older couples). Results revealed an age by test interaction. Largely consistent with prior research, we found some evidence that older adults performed worse than young adults when judging single emotions from images of faces (for sad and disgust faces only) and eyes (for older eyes only), with middle-aged adults falling in between. In contrast, older adults did better than young adults on the test involving continuous emotion judgments of dyadic interactions, with middle-aged adults falling in between. In tests in which target stimuli differed in age, emotion recognition was not facilitated by an age match between participant and target. These findings are discussed in terms of theoretical and methodological implications for the study of aging and emotional processing.
A multimodal approach to emotion recognition ability in autism spectrum disorders.
Jones, Catherine R G; Pickles, Andrew; Falcaro, Milena; Marsden, Anita J S; Happé, Francesca; Scott, Sophie K; Sauter, Disa; Tregay, Jenifer; Phillips, Rebecca J; Baird, Gillian; Simonoff, Emily; Charman, Tony
2011-03-01
Autism spectrum disorders (ASD) are characterised by social and communication difficulties in day-to-day life, including problems in recognising emotions. However, experimental investigations of emotion recognition ability in ASD have been equivocal, hampered by small sample sizes, narrow IQ range and over-focus on the visual modality. We tested 99 adolescents (mean age 15;6 years, mean IQ 85) with an ASD and 57 adolescents without an ASD (mean age 15;6 years, mean IQ 88) on a facial emotion recognition task and two vocal emotion recognition tasks (one verbal; one non-verbal). Recognition of happiness, sadness, fear, anger, surprise and disgust was tested. Using structural equation modelling, we conceptualised emotion recognition ability as a multimodal construct, measured by the three tasks. We examined how the mean levels of recognition of the six emotions differed by group (ASD vs. non-ASD) and IQ (≥ 80 vs. < 80). We found no evidence of a fundamental emotion recognition deficit in the ASD group and analysis of error patterns suggested that the ASD group were vulnerable to the same pattern of confusions between emotions as the non-ASD group. However, recognition ability was significantly impaired in the ASD group for surprise. IQ had a strong and significant effect on performance for the recognition of all six emotions, with higher IQ adolescents outperforming lower IQ adolescents. The findings do not suggest a fundamental difficulty with the recognition of basic emotions in adolescents with ASD.
Emotional Recognition in Autism Spectrum Conditions from Voices and Faces
Stewart, Mary E.; McAdam, Clair; Ota, Mitsuhiko; Peppe, Sue; Cleland, Joanne
2013-01-01
The present study reports on a new vocal emotion recognition task and assesses whether people with autism spectrum conditions (ASC) perform differently from typically developed individuals on tests of emotional identification from both the face and the voice. The new test of vocal emotion contained trials in which the vocal emotion of the sentence…
Canli, Derya; Ozdemir, Hatice; Kocak, Orhan Murat
2015-08-01
Studies provide evidence for impaired social cognition in schizotypy and its association with negative symptoms. Cognitive features related to magical ideation - a component of the positive dimension of schizotypy - have been less investigated. We aimed to assess social cognitive functioning among adolescents with high magical ideation scores, mainly focusing on face and emotion recognition. Twenty-two subjects with magical ideation scale scores above the cut-off level and 22 controls with the lowest scores, from among 250 students screened with this scale, were included in the study. A face and emotion recognition n-back test, the empathy quotient, theory of mind tests and the Physical Anhedonia Scale were applied to both the magical ideation and control groups. The magical ideation group performed significantly worse than controls on both face and emotion recognition tests. Emotion recognition performance was found to be affected by memory load, with sadness, among the emotions, revealing a difference between the two groups. Empathy and theory of mind tests did not distinguish the magical ideation group from controls. Our findings provide evidence for a deficit in negative emotion recognition, affected by memory load, associated with magical ideation in adolescents.
Instructions to mimic improve facial emotion recognition in people with sub-clinical autism traits.
Lewis, Michael B; Dunn, Emily
2017-11-01
People tend to mimic the facial expressions of others. It has been suggested that this helps provide social glue between affiliated people, but it could also aid recognition of emotions through embodied cognition. The degree of facial mimicry, however, varies between individuals and is limited in people with autism spectrum conditions (ASC). The present study sought to investigate the effect of promoting facial mimicry during a facial-emotion-recognition test. In two experiments, participants without an ASC diagnosis had their autism quotient (AQ) measured. Following a baseline test, they did an emotion-recognition test again, but half of the participants were asked to mimic the target face they saw prior to making their responses. Mimicry improved emotion recognition, and further analysis revealed that the largest improvement was for participants with higher autism-trait scores. In fact, recognition performance was best overall for people who had high AQ scores and also received the instruction to mimic. Implications for people with ASC are explored.
Stereotypes and prejudice affect the recognition of emotional body postures.
Bijlstra, Gijsbert; Holland, Rob W; Dotsch, Ron; Wigboldus, Daniel H J
2018-03-26
Most research on emotion recognition focuses on facial expressions. However, people communicate emotional information through bodily cues as well. Prior research on facial expressions has demonstrated that emotion recognition is modulated by top-down processes. Here, we tested whether this top-down modulation generalizes to the recognition of emotions from body postures. We report three studies demonstrating that stereotypes and prejudice about men and women may affect how fast people classify various emotional body postures. Our results suggest that gender cues activate gender associations, which affect the recognition of emotions from body postures in a top-down fashion.
Castro-Vale, Ivone; Severo, Milton; Carvalho, Davide; Mota-Cardoso, Rui
2015-01-01
Emotion recognition is very important for social interaction. Several mental disorders influence facial emotion recognition. War veterans and their offspring are subject to an increased risk of developing psychopathology. Emotion recognition is an important aspect that needs to be addressed in this population. To our knowledge, no test exists that is validated for use with war veterans and their offspring. The current study aimed to validate the JACFEE photo set to study facial emotion recognition in war veterans and their offspring. The JACFEE photo set was presented to 135 participants, comprising 62 male war veterans and 73 war veterans' offspring. The participants identified the facial emotion presented from amongst the seven emotions that were tested for: anger, contempt, disgust, fear, happiness, sadness, and surprise. A loglinear model was used to evaluate whether the agreement between the intended and the chosen emotions was higher than expected. Overall agreement between chosen and intended emotions was 76.3% (Cohen kappa = 0.72). The agreement ranged from 63% (sadness expressions) to 91% (happiness expressions). The reliability by emotion ranged from 0.617 to 0.843 and the overall JACFEE photo set Cronbach alpha was 0.911. The offspring showed higher agreement when compared with the veterans (RR: 41.52 vs 12.12, p < 0.001), which confirms the construct validity of the test. The JACFEE set of photos showed good validity and reliability indices, which makes it an adequate instrument for researching emotion recognition ability in the study sample of war veterans and their respective offspring.
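Cohen's kappa, reported above alongside raw agreement, corrects the observed agreement for the agreement expected by chance given how often each label occurs. A minimal sketch of the computation, on hypothetical labels rather than the JACFEE data:

```python
from collections import Counter

def cohen_kappa(intended, chosen):
    """Chance-corrected agreement: kappa = (p_o - p_e) / (1 - p_e),
    where p_o is the observed agreement and p_e is the agreement
    expected by chance from the two sets of label marginals."""
    assert len(intended) == len(chosen) and intended
    n = len(intended)
    # Observed agreement: fraction of exact matches.
    p_o = sum(a == b for a, b in zip(intended, chosen)) / n
    # Chance agreement: product of marginal frequencies, summed over labels.
    ci, cc = Counter(intended), Counter(chosen)
    p_e = sum(ci[k] * cc[k] for k in ci) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical intended vs. chosen emotion labels:
intended = ["happy", "sad", "happy", "anger", "sad", "happy"]
chosen   = ["happy", "sad", "happy", "sad",   "sad", "happy"]
print(round(cohen_kappa(intended, chosen), 3))  # 0.714
```

Here 5 of 6 responses match (raw agreement 0.83), but kappa is lower because chance agreement on the frequent labels is already substantial; this is the same distinction the abstract draws between 76.3% agreement and kappa = 0.72.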
Lysaker, Paul H; Leonhardt, Bethany L; Brüne, Martin; Buck, Kelly D; James, Alison; Vohs, Jenifer; Francis, Michael; Hamm, Jay A; Salvatore, Giampaolo; Ringer, Jamie M; Dimaggio, Giancarlo
2014-09-30
While many with schizophrenia spectrum disorders experience difficulties understanding the feelings of others, little is known about the psychological antecedents of these deficits. To explore these issues we examined whether deficits in mental state decoding, mental state reasoning and metacognitive capacity predict performance on an emotion recognition task. Participants were 115 adults with a schizophrenia spectrum disorder and 58 adults with substance use disorders but no history of a diagnosis of psychosis, who completed the Eyes Test and the Hinting Test. Metacognitive capacity was assessed using the Metacognitive Assessment Scale Abbreviated and emotion recognition was assessed using the Bell Lysaker Emotion Recognition Test. Results revealed that the schizophrenia patients performed more poorly than controls on tests of emotion recognition, mental state decoding, mental state reasoning and metacognition. Lesser capacities for mental state decoding, mental state reasoning and metacognition were all uniquely related to emotion recognition within the schizophrenia group, even after controlling for neurocognition and symptoms in a stepwise multiple regression. Results suggest that deficits in emotion recognition in schizophrenia may partly result from a combination of impairments in the ability to judge the cognitive and affective states of others and difficulties forming complex representations of self and others.
Recognition of emotion with temporal lobe epilepsy and asymmetrical amygdala damage.
Fowler, Helen L; Baker, Gus A; Tipples, Jason; Hare, Dougal J; Keller, Simon; Chadwick, David W; Young, Andrew W
2006-08-01
Impairments in emotion recognition occur when there is bilateral damage to the amygdala. In this study, ability to recognize auditory and visual expressions of emotion was investigated in people with asymmetrical amygdala damage (AAD) and temporal lobe epilepsy (TLE). Recognition of five emotions was tested across three participant groups: those with right AAD and TLE, those with left AAD and TLE, and a comparison group. Four tasks were administered: recognition of emotion from facial expressions, sentences describing emotion-laden situations, nonverbal sounds, and prosody. Accuracy scores for each task and emotion were analysed, and no consistent overall effect of AAD on emotion recognition was found. However, some individual participants with AAD were significantly impaired at recognizing emotions, in both auditory and visual domains. The findings indicate that a minority of individuals with AAD have impairments in emotion recognition, but no evidence of specific impairments (e.g., visual or auditory) was found.
Impairment in the recognition of emotion across different media following traumatic brain injury.
Williams, Claire; Wood, Rodger Ll
2010-02-01
The current study examined emotion recognition following traumatic brain injury (TBI) and whether performance differed according to the affective valence and type of media presentation of the stimuli. A total of 64 patients with TBI and matched controls completed the Emotion Evaluation Test (EET) and Ekman 60 Faces Test (E-60-FT). Patients with TBI also completed measures of information processing and verbal ability. Results revealed that the TBI group were significantly impaired compared to controls when recognizing emotion on the EET and E-60-FT. A significant main effect of valence was found in both groups, with poorer recognition of negative emotions. However, the difference between the recognition of positive and negative emotions was larger in the TBI group. The TBI group were also more accurate at recognizing emotion displayed in audiovisual media (EET) than in still media (E-60-FT). No significant relationship was obtained between emotion recognition tasks and information-processing speed. A significant positive relationship was found between the E-60-FT and one measure of verbal ability. These findings support models of emotion that specify separate neurological pathways for certain emotions and different media, and confirm that patients with TBI are vulnerable to emotion recognition difficulties.
Recognition of facial and musical emotions in Parkinson's disease.
Saenz, A; Doé de Maindreville, A; Henry, A; de Labbey, S; Bakchine, S; Ehrlé, N
2013-03-01
Patients with amygdala lesions were found to be impaired in recognizing the fear emotion both from faces and from music. In patients with Parkinson's disease (PD), impairment in recognition of emotions from facial expressions has been reported for disgust, fear, sadness and anger, but no study had yet investigated this population for the recognition of emotions from both face and music. The ability to recognize basic universal emotions (fear, happiness and sadness) from both face and music was investigated in 24 medicated patients with PD and 24 healthy controls. The patient group was tested for language (verbal fluency tasks), memory (digit and spatial span), executive functions (Similarities and Picture Completion subtests of the WAIS III, Brixton and Stroop tests), and visual attention (Bells test), and completed self-assessment tests for anxiety and depression. Results showed that the PD group was significantly impaired in the recognition of both fear and sadness from facial expressions, whereas their performance in recognition of emotions from musical excerpts did not differ from that of the control group. The scores for fear and sadness recognition from faces were correlated neither with scores in tests of executive and cognitive functions, nor with scores on the self-assessment scales. We attributed the observed dissociation to the modality (visual vs. auditory) of presentation and to the ecological value of the musical stimuli that we used. We discuss the relevance of our findings for the care of patients with PD. © 2012 The Author(s) European Journal of Neurology © 2012 EFNS.
Besche-Richard, C; Bourrin-Tisseron, A; Olivier, M; Cuervo-Lombard, C-V; Limosin, F
2012-06-01
Deficits in the recognition of facial emotions and in the attribution of mental states are now well documented in schizophrenic patients. However, little is known about the link between these two complex cognitive functions, especially in schizophrenia. In this study, we attempted to test the link between the recognition of facial emotions and mentalization capacities, notably the attribution of beliefs, in healthy and schizophrenic participants. We hypothesized that the level of performance in the recognition of facial emotions, compared to working memory and executive functioning, was the best predictor of the capacity to attribute a belief. Twenty clinically stabilized schizophrenic participants diagnosed according to DSM-IV-TR (mean age: 35.9 years, S.D. 9.07; mean education level: 11.15 years, S.D. 2.58), receiving neuroleptic or antipsychotic medication, participated in the study. They were matched on age (mean age: 36.3 years, S.D. 10.9) and educational level (mean educational level: 12.10, S.D. 2.25) with 30 healthy participants. All participants were evaluated with a pool of tasks testing the recognition of facial emotions (the faces of Baron-Cohen), the attribution of beliefs (two first-order and two second-order stories), working memory (the digit span of the WAIS-III and the Corsi test) and executive functioning (Trail Making Test A and B, Wisconsin Card Sorting Test brief version). Comparing schizophrenic and healthy participants, our results confirmed a difference between performance in the recognition of facial emotions and performance in the attribution of beliefs. A simple linear regression showed that the recognition of facial emotions, compared to working memory and executive functioning, was the best predictor of performance on the theory of mind stories.
Our results confirmed, in a sample of schizophrenic patients, the deficits in the recognition of facial emotions and in the attribution of mental states. Our novel finding is that performance in the recognition of facial emotions is the best predictor of performance in the attribution of beliefs. Marshall et al.'s model of empathy can explain this link between the recognition of facial emotions and the comprehension of beliefs. Copyright © 2011 L’Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.
Trinkler, Iris; Cleret de Langavant, Laurent; Bachoud-Lévi, Anne-Catherine
2013-02-01
Patients with Huntington's disease (HD), a neurodegenerative disorder that causes major motor impairments, also show cognitive and emotional deficits. While their deficit in recognising emotions has been explored in depth, little is known about their ability to express emotions and understand their feelings. If these faculties were impaired, patients might not only mis-read emotion expressions in others, but their own emotions might be mis-interpreted by others as well, or, thirdly, they might have difficulties understanding and describing their feelings. We compared the performance of recognition and expression of facial emotions in 13 HD patients with mild motor impairments but without significant bucco-facial abnormalities, and 13 controls matched for age and education. Emotion recognition was investigated in a forced-choice recognition test (FCR), and emotion expression by filming participants while they mimed the six basic emotional facial expressions (anger, disgust, fear, surprise, sadness and joy) to the experimenter. The films were then segmented into 60 stimuli per participant and four external raters performed a FCR on this material. Further, we tested understanding of feelings in self (alexithymia) and others (empathy) using questionnaires. Both recognition and expression were impaired across different emotions in HD compared to controls, and recognition and expression scores were correlated. By contrast, alexithymia and empathy scores were very similar in HD and controls. This suggests that emotion deficits in HD might be tied to the expression itself. Because similar emotion recognition-expression deficits are also found in Parkinson's disease and vascular lesions of the striatum, our results further confirm the importance of the striatum for emotion recognition and expression, while access to the meaning of feelings relies on a different brain network, and is spared in HD. Copyright © 2011 Elsevier Ltd. All rights reserved.
Action and emotion recognition from point light displays: an investigation of gender differences.
Alaerts, Kaat; Nackaerts, Evelien; Meyns, Pieter; Swinnen, Stephan P; Wenderoth, Nicole
2011-01-01
Folk psychology advocates the existence of gender differences in socio-cognitive functions such as 'reading' the mental states of others or discerning subtle differences in body language. A female advantage has been demonstrated for emotion recognition from facial expressions, but virtually nothing is known about gender differences in recognizing bodily stimuli or body language. The aim of the present study was to investigate potential gender differences in a series of tasks involving the recognition of distinct features from point light displays (PLDs) depicting bodily movements of a male and a female actor. Although recognition scores were high at the overall group level, female participants were more accurate than males in recognizing the depicted actions from PLDs. Response times were significantly higher for males than for females on PLD recognition tasks involving (i) the general recognition of 'biological motion' versus 'non-biological' (or 'scrambled') motion; or (ii) the recognition of the 'emotional state' of the PLD-figures. No gender differences were revealed for a control test (involving the identification of a color change in one of the dots) or for recognizing the gender of the PLD-figure. In addition, previous findings of a female advantage on a facial emotion recognition test (the 'Reading the Mind in the Eyes Test' (Baron-Cohen, 2001)) were replicated in this study. Interestingly, a strong correlation was revealed between emotion recognition from bodily PLDs and from facial cues. This relationship indicates that inter-individual or gender-dependent differences in recognizing emotions are relatively generalized across facial and bodily emotion perception. 
Moreover, the tight correlation between a subject's ability to discern subtle emotional cues from PLDs and the subject's ability to basically discriminate biological from non-biological motion provides indications that differences in emotion recognition may - at least to some degree - be related to more basic differences in processing biological motion per se.
Fink, Elian; de Rosnay, Marc; Wierda, Marlies; Koot, Hans M; Begeer, Sander
2014-09-01
The empirical literature has presented inconsistent evidence for deficits in the recognition of basic emotion expressions in children with autism spectrum disorders (ASD), which may be due to the reliance on relatively small sample sizes. Additionally, it has been proposed that although children with ASD may correctly identify emotion expressions, they rely on more deliberate, time-consuming strategies to accurately recognize emotion expressions when compared to typically developing children. In the current study, we examine both emotion recognition accuracy and response time in a large sample of children, and explore the moderating influence of verbal ability on these findings. The sample consisted of 86 children with ASD (M age = 10.65) and 114 typically developing children (M age = 10.32) between 7 and 13 years of age. All children completed a pre-test (emotion word-word matching) and a test phase consisting of basic emotion recognition, whereby they were required to match a target emotion expression to the correct emotion word; accuracy and response time were recorded. Verbal IQ was controlled for in the analyses. We found no evidence of a systematic deficit in emotion recognition accuracy or response time for children with ASD, controlling for verbal ability. However, when controlling for children's accuracy in word-word matching, children with ASD had significantly lower emotion recognition accuracy than typically developing children. The findings suggest that the social impairments observed in children with ASD are not the result of marked deficits in basic emotion recognition accuracy or longer response times. However, children with ASD may be relying on other perceptual skills (such as advanced word-word matching) to complete emotion recognition tasks at a similar level as typically developing children.
Wang, Pengyun; Li, Juan; Li, Huijie; Li, Bing; Jiang, Yang; Bao, Feng; Zhang, Shouzi
2013-11-01
This study investigated whether the observed absence of emotional memory enhancement in recognition tasks in patients with amnestic mild cognitive impairment (aMCI) could be related to their greater proportion of familiarity-based responses for all stimuli, and whether recognition tests with emotional items had better discriminative power for aMCI patients than those with neutral items. In total, 31 aMCI patients and 30 healthy older adults participated in a recognition test followed by remember/know judgments. Positive, neutral, and negative faces were used as stimuli. For overall recognition performance, emotional memory enhancement was found only in healthy controls; they remembered more negative and positive stimuli than neutral ones. For "remember" responses, we found equivalent emotional memory enhancement in both groups, though a greater proportion of "remember" responses was observed in normal controls. For "know" responses, aMCI patients presented a larger proportion than normal controls did, and their "know" responses were not affected by emotion. A negative correlation was found between the emotional enhancement effect and the memory performance related to "know" responses. In addition, receiver operating characteristic curve analysis revealed higher diagnostic accuracy for the recognition test with emotional stimuli than with neutral stimuli. The present results imply that the absence of the emotional memory enhancement effect in aMCI patients might be related to their tendency to rely more on familiarity-based "know" responses for all stimuli. Furthermore, recognition memory tests using emotional stimuli may differentiate people with aMCI from cognitively normal older adults better than tests using neutral stimuli. PsycINFO Database Record (c) 2013 APA, all rights reserved.
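The receiver operating characteristic (ROC) analysis mentioned above summarizes how well a recognition score separates patients from controls. The area under the ROC curve has a simple rank interpretation: the probability that a randomly chosen control outscores a randomly chosen patient. A minimal sketch on invented scores (not the study's data):

```python
def auc(scores_controls, scores_patients):
    """Area under the ROC curve via its rank interpretation: the
    probability that a randomly chosen control outscores a randomly
    chosen patient, counting ties as 0.5."""
    pairs = [(c, p) for c in scores_controls for p in scores_patients]
    wins = sum(1.0 if c > p else 0.5 if c == p else 0.0 for c, p in pairs)
    return wins / len(pairs)

# Hypothetical recognition scores (higher = better memory performance)
controls = [18, 17, 16, 15]
patients = [14, 15, 12, 11]
print(auc(controls, patients))  # prints 0.96875
```

An AUC near 1.0 indicates good discrimination; the study's finding is that this area was larger when the test items were emotional rather than neutral.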
Kumfor, Fiona; Irish, Muireann; Hodges, John R.; Piguet, Olivier
2013-01-01
Patients with frontotemporal dementia have pervasive changes in emotion recognition and social cognition, yet the neural changes underlying these emotion processing deficits remain unclear. The multimodal system model of emotion proposes that basic emotions are dependent on distinct brain regions, which undergo significant pathological changes in frontotemporal dementia. As such, this syndrome may provide important insight into the impact of neural network degeneration upon the innate ability to recognise emotions. This study used voxel-based morphometry to identify discrete neural correlates involved in the recognition of basic emotions (anger, disgust, fear, sadness, surprise and happiness) in frontotemporal dementia. Forty frontotemporal dementia patients (18 behavioural-variant, 11 semantic dementia, 11 progressive nonfluent aphasia) and 27 healthy controls were tested on two facial emotion recognition tasks: The Ekman 60 and Ekman Caricatures. Although each frontotemporal dementia group showed impaired recognition of negative emotions, distinct associations between emotion-specific task performance and changes in grey matter intensity emerged. Fear recognition was associated with the right amygdala; disgust recognition with the left insula; anger recognition with the left middle and superior temporal gyrus; and sadness recognition with the left subcallosal cingulate, indicating that discrete neural substrates are necessary for emotion recognition in frontotemporal dementia. The erosion of emotion-specific neural networks in neurodegenerative disorders may produce distinct profiles of performance that are relevant to understanding the neurobiological basis of emotion processing. PMID:23805313
Buunk, Anne M; Groen, Rob J M; Veenstra, Wencke S; Metzemaekers, Jan D M; van der Hoeven, Johannes H; van Dijk, J Marc C; Spikman, Jacoba M
2016-11-01
The authors' aim was to investigate cognitive outcome in patients with aneurysmal and angiographically negative subarachnoid hemorrhage (aSAH and anSAH) by comparing them to healthy controls and to each other. Besides investigating cognitive functions such as memory and attention, they focused on higher-order prefrontal functions, namely executive functioning (EF) and emotion recognition. Patients and healthy controls were assessed with tests measuring memory (15 Words Test, Digit Span), attention and processing speed (Trail Making Test A and B), EF (Zoo Map, Letter Fluency, Dysexecutive Questionnaire), and emotion recognition (Facial Expressions of Emotion Stimuli and Tests). Between-groups comparisons of test performances were made. Patients with aSAH scored significantly lower than healthy controls on measures of memory, processing speed, and attention, but anSAH patients did not. In the higher-order prefrontal functions (EF and emotion recognition), aSAH patients were clearly impaired compared to healthy controls. However, anSAH patients did not perform significantly better than aSAH patients on the majority of the tests. In the subacute phase after SAH, cognitive functions, including the higher-order prefrontal functions EF and emotion recognition, were clearly impaired in aSAH patients. Patients with anSAH did not perform better than aSAH patients, which indicates that these functions may also be affected to some extent in anSAH patients. Considering the importance of these higher-order prefrontal functions for daily life functioning, and following the results of the present study, tests that measure emotion recognition and EF should be part of the standard neuropsychological assessment after SAH. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Facial emotion recognition in paranoid schizophrenia and autism spectrum disorder.
Sachse, Michael; Schlitt, Sabine; Hainz, Daniela; Ciaramidaro, Angela; Walter, Henrik; Poustka, Fritz; Bölte, Sven; Freitag, Christine M
2014-11-01
Schizophrenia (SZ) and autism spectrum disorder (ASD) share deficits in emotion processing. In order to identify convergent and divergent mechanisms, we investigated facial emotion recognition in SZ, high-functioning ASD (HFASD), and typically developed controls (TD). Different degrees of task difficulty and emotion complexity (face, eyes; basic emotions, complex emotions) were used. Two Benton tests were implemented in order to assess potentially confounding visuo-perceptual functioning and facial processing. Nineteen participants with paranoid SZ, 22 with HFASD and 20 TD were included, aged between 14 and 33 years. Individuals with SZ were comparable to TD in all obtained emotion recognition measures, but showed reduced basic visuo-perceptual abilities. The HFASD group was impaired in the recognition of basic and complex emotions compared to both SZ and TD. When facial identity recognition was adjusted for, group differences remained for the recognition of complex emotions only. Our results suggest that there is a SZ subgroup with predominantly paranoid symptoms that does not show problems in face processing and emotion recognition, but visuo-perceptual impairments. They also confirm the notion of a general facial and emotion recognition deficit in HFASD. No shared emotion recognition deficit was found for paranoid SZ and HFASD, emphasizing the differential cognitive underpinnings of the two disorders. Copyright © 2014 Elsevier B.V. All rights reserved.
The Relationships among Facial Emotion Recognition, Social Skills, and Quality of Life.
ERIC Educational Resources Information Center
Simon, Elliott W.; And Others
1995-01-01
Forty-six institutionalized adults with mild or moderate mental retardation were administered the Vineland Adaptive Behavior Scales (socialization domain), a subjective measure of quality of life, and a facial emotion recognition test. Facial emotion recognition, quality of life, and social skills appeared to be independent of one another. Facial…
Influence of Emotional Facial Expressions on 3-5-Year-Olds' Face Recognition
ERIC Educational Resources Information Center
Freitag, Claudia; Schwarzer, Gudrun
2011-01-01
Three experiments examined 3- and 5-year-olds' recognition of faces in constant and varied emotional expressions. Children were asked to identify repeatedly presented target faces, distinguishing them from distractor faces, during an immediate recognition test and during delayed assessments after 10 min and one week. Emotional facial expression…
Mirandola, Chiara; Toffalini, Enrico; Grassano, Massimo; Cornoldi, Cesare; Melinder, Annika
2014-01-01
The present experiment was conducted to investigate whether negative emotionally charged and arousing content of to-be-remembered scripted material would affect propensity towards memory distortions. We further investigated whether elaboration of the studied material through free recall would affect the magnitude of memory errors. In this study participants saw eight scripts. Each of the scripts included an effect of an action, the cause of which was not presented. Effects were either negatively emotional or neutral. Participants were assigned to either a yes/no recognition test group (recognition), or to a recall and yes/no recognition test group (elaboration + recognition). Results showed that participants in the recognition group produced fewer memory errors in the emotional condition. Conversely, elaboration + recognition participants had lower accuracy and produced more emotional memory errors than the other group, suggesting a mediating role of semantic elaboration on the generation of false memories. The role of emotions and semantic elaboration on the generation of false memories is discussed.
Koelkebeck, Katja; Kohl, Waldemar; Luettgenau, Julia; Triantafillou, Susanna; Ohrmann, Patricia; Satoh, Shinji; Minoshita, Seiko
2015-07-30
A novel emotion recognition task that employs photos of a Japanese mask representing a highly ambiguous stimulus was evaluated. As non-Asians perceive and/or label emotions differently from Asians, we aimed to identify patterns of task performance in non-Asian healthy volunteers with a view to future patient studies. The Noh mask test was presented to 42 adult German participants. Reaction times and emotion attribution patterns were recorded. To control for emotion identification abilities, a standard emotion recognition task was administered, among other measures. Questionnaires assessed personality traits. Finally, results were compared to age- and gender-matched Japanese volunteers. Compared to the other tasks, German participants displayed the slowest reaction times on the Noh mask test, indicating the higher demands of ambiguous emotion recognition. They assigned more positive emotions to the mask than Japanese volunteers, demonstrating culture-dependent emotion identification patterns. As alexithymic and anxious traits were associated with slower reaction times, personality dimensions also affected performance. We showed an advantage of ambiguous over conventional emotion recognition tasks. Moreover, we determined emotion identification patterns in Western individuals shaped by personality dimensions, suggesting performance differences in clinical samples. Due to its properties, the Noh mask test represents a promising tool in the differential diagnosis of psychiatric disorders, e.g. schizophrenia. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Emotion Recognition in Frontotemporal Dementia and Alzheimer's Disease: A New Film-Based Assessment
Goodkind, Madeleine S.; Sturm, Virginia E.; Ascher, Elizabeth A.; Shdo, Suzanne M.; Miller, Bruce L.; Rankin, Katherine P.; Levenson, Robert W.
2015-01-01
Deficits in recognizing others' emotions are reported in many psychiatric and neurological disorders, including autism, schizophrenia, behavioral variant frontotemporal dementia (bvFTD) and Alzheimer's disease (AD). Most previous emotion recognition studies have required participants to identify emotional expressions in photographs. This type of assessment differs from real-world emotion recognition in important ways: Images are static rather than dynamic, include only 1 modality of emotional information (i.e., visual information), and are presented absent a social context. Additionally, existing emotion recognition batteries typically include multiple negative emotions, but only 1 positive emotion (i.e., happiness) and no self-conscious emotions (e.g., embarrassment). We present initial results using a new task for assessing emotion recognition that was developed to address these limitations. In this task, respondents view a series of short film clips and are asked to identify the main characters' emotions. The task assesses multiple negative, positive, and self-conscious emotions based on information that is multimodal, dynamic, and socially embedded. We evaluate this approach in a sample of patients with bvFTD, AD, and normal controls. Results indicate that patients with bvFTD have emotion recognition deficits in all 3 categories of emotion compared to the other groups. These deficits were especially pronounced for negative and self-conscious emotions. Emotion recognition in this sample of patients with AD was indistinguishable from controls. These findings underscore the utility of this approach to assessing emotion recognition and suggest that previous findings that recognition of positive emotion was preserved in dementia patients may have resulted from the limited sampling of positive emotion in traditional tests. PMID:26010574
Modeling recall memory for emotional objects in Alzheimer's disease.
Sundstrøm, Martin
2011-07-01
To examine whether emotional memory (EM) for objects with self-reference in Alzheimer's disease (AD) can be modeled with binomial logistic regression in a free recall and an object recognition test to predict EM enhancement. Twenty patients with AD and twenty healthy controls were studied. Six objects (three presented as gifts) were shown to each participant. Ten minutes later, a free recall and a recognition test were administered. The recognition test mixed target objects with six similar distractor objects. Participants were asked to name any object in the recall test and to identify each object in the recognition test as known or unknown. The proportion of gift objects recalled by AD patients (41.6%) was larger than that of neutral objects (13.3%), and a significant EM recall effect for gifts was found (Wilcoxon: p < .003). EM was not found for recognition in AD patients due to a ceiling effect. Healthy older adults scored higher overall in recall and recognition but showed no EM enhancement, also due to a ceiling effect. A logistic regression showed that the likelihood of emotional recall memory can be modeled as a function of MMSE score (p < .014) and object status (p < .0001) as gift or non-gift. Recall memory was enhanced in AD patients for emotional objects, indicating that EM in mild to moderate AD, although impaired, can be provoked with a strong emotional load. The logistic regression model suggests that EM declines gradually with the progression of AD rather than being disrupted outright, and it may be a useful tool for evaluating the magnitude of emotional load.
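The binomial logistic regression described above models recall probability as a function of MMSE score and gift status. A minimal from-scratch sketch on invented data (the predictor values and outcomes below are illustrative assumptions, not the study's data):

```python
import math

def sigmoid(z):
    # Numerically stable logistic function
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    e = math.exp(z)
    return e / (1.0 + e)

def fit_logistic(X, y, lr=0.1, epochs=3000):
    """Binomial logistic regression fitted by stochastic gradient ascent
    on the log-likelihood. X: feature rows, y: 0/1 recall outcomes."""
    w = [0.0] * (len(X[0]) + 1)  # intercept + one weight per predictor
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = yi - sigmoid(z)  # gradient of the log-likelihood
            w[0] += lr * err
            for j, xj in enumerate(xi):
                w[j + 1] += lr * err * xj
    return w

def predict(w, xi):
    """Predicted probability of recall for one feature row."""
    return sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))

# Hypothetical rows: [centered MMSE score, gift indicator (1 = gift)]
X = [[-4, 0], [-4, 1], [-2, 0], [-2, 1], [0, 0], [0, 1], [2, 0], [2, 1]]
y = [0, 0, 0, 1, 0, 1, 1, 1]  # 1 = object recalled
w = fit_logistic(X, y)
```

On data of this shape, a positive weight on the gift indicator captures the EM enhancement, while the MMSE weight captures the decline of recall with disease severity.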
Emotion recognition pattern in adolescent boys with attention-deficit/hyperactivity disorder.
Aspan, Nikoletta; Bozsik, Csilla; Gadoros, Julia; Nagy, Peter; Inantsy-Pap, Judit; Vida, Peter; Halasz, Jozsef
2014-01-01
Social and emotional deficits were recently considered as inherent features of individuals with attention-deficit hyperactivity disorder (ADHD), but only sporadic literature data exist on emotion recognition in adolescents with ADHD. The aim of the present study was to establish the emotion recognition profile of adolescent boys with ADHD in comparison with control adolescents. Forty-four adolescent boys (13-16 years) participated in the study after informed consent; 22 boys had a clinical diagnosis of ADHD, while data were also assessed from 22 control adolescent boys matched for age and Raven IQ. Parent- and self-reported behavioral characteristics were assessed by means of the Strengths and Difficulties Questionnaire. The recognition of six basic emotions was evaluated by the "Facial Expressions of Emotion-Stimuli and Tests." Compared to controls, adolescents with ADHD were more sensitive in the recognition of disgust, worse in the recognition of fear, and showed a tendency toward impaired recognition of sadness. Hyperactivity measures showed an inverse correlation with fear recognition. Our data suggest that adolescent boys with ADHD have alterations in the recognition of specific emotions.
Jehna, Margit; Neuper, Christa; Petrovic, Katja; Wallner-Blazek, Mirja; Schmidt, Reinhold; Fuchs, Siegrid; Fazekas, Franz; Enzinger, Christian
2010-07-01
Multiple sclerosis (MS) is a chronic multifocal CNS disorder which can affect higher-order cognitive processes. Whereas cognitive disturbances in MS are increasingly well characterised, emotional facial expression (EFE) recognition has rarely been tested, despite its importance for adequate social behaviour. We tested 20 patients with MS or a clinically isolated syndrome suggestive of MS (CIS) and 23 healthy controls (HC) for the ability to differentiate between emotional facial stimuli, controlling for the influence of depressive mood (ADS-L). We screened for cognitive dysfunction using the Faces Symbol Test (FST). The patients demonstrated significantly decreased reaction times on emotion recognition tests compared to HC. However, the results also suggested worse cognitive abilities in the patients, and emotional and cognitive test results were correlated. This exploratory pilot study suggests that emotion recognition deficits might be prevalent in MS. However, future studies will be needed to overcome the limitations of this study. Copyright 2010 Elsevier B.V. All rights reserved.
Golan, Ofer; Baron-Cohen, Simon; Hill, Jacqueline
2006-02-01
Adults with Asperger Syndrome (AS) can recognise simple emotions and pass basic theory of mind tasks, but have difficulties recognising more complex emotions and mental states. This study describes a new battery of tasks testing recognition of 20 complex emotions and mental states from faces and voices. The battery was given to males and females with AS and matched controls. Results showed the AS group performed worse than controls overall on emotion recognition from faces and voices, and on 12 of the 20 specific emotions. Females recognised faces better than males regardless of diagnosis, and males with AS had more difficulty recognising emotions from faces than from voices. The implications of these results are discussed in relation to social functioning in AS.
Gender differences in facial emotion recognition in persons with chronic schizophrenia.
Weiss, Elisabeth M; Kohler, Christian G; Brensinger, Colleen M; Bilker, Warren B; Loughead, James; Delazer, Margarete; Nolan, Karen A
2007-03-01
The aim of the present study was to investigate possible sex differences in the recognition of facial expressions of emotion and to examine the pattern of classification errors in schizophrenic males and females. Such an approach provides an opportunity to inspect the degree to which males and females differ in perceiving and interpreting the different emotions displayed to them and to analyze which emotions are most susceptible to recognition errors. Fifty-six chronically hospitalized schizophrenic patients (38 men and 18 women) completed the Penn Emotion Recognition Test (ER40), a computerized emotion discrimination test presenting 40 color photographs of evoked happy, sad, angry, and fearful expressions as well as neutral expressions, balanced for poser gender and ethnicity. We found a significant sex difference in the patterns of error rates in the Penn Emotion Recognition Test. Neutral faces were more commonly mistaken as angry by schizophrenic men, whereas schizophrenic women misinterpreted neutral faces more frequently as sad. Moreover, female faces were better recognized overall, but fear was better recognized in same-gender photographs, whereas anger was better recognized in different-gender photographs. The findings of the present study lend support to the notion that sex differences in aggressive behavior could be related to a cognitive style characterized by hostile attributions to neutral faces in schizophrenic men.
Does cortisol modulate emotion recognition and empathy?
Duesenberg, Moritz; Weber, Juliane; Schulze, Lars; Schaeuffele, Carmen; Roepke, Stefan; Hellmann-Regen, Julian; Otte, Christian; Wingenfeld, Katja
2016-04-01
Emotion recognition and empathy are important aspects of interacting with and understanding other people's behaviors and feelings. The human environment comprises stressful situations that affect social interactions on a daily basis. The aim of the study was to examine the effects of the stress hormone cortisol on emotion recognition and empathy. In this placebo-controlled study, 40 healthy men and 40 healthy women (mean age 24.5 years) received either 10 mg of hydrocortisone or placebo. We used the Multifaceted Empathy Test to measure emotional and cognitive empathy. Furthermore, we examined emotion recognition from facial expressions, which contained two emotions (anger and sadness) at two intensities (40% and 80%). We did not find a main effect of treatment or sex on either empathy or emotion recognition, but we did find a sex × emotion interaction on emotion recognition. The main result was a four-way interaction on emotion recognition involving treatment, sex, emotion, and task difficulty. At 40% intensity, women recognized angry faces better than men in the placebo condition. Furthermore, in the placebo condition, men recognized sadness better than anger. At 80% intensity, men and women performed equally well in recognizing sad faces, but men performed worse than women on angry faces. Our results did not support the hypothesis that an increase in cortisol concentration alone influences empathy and emotion recognition in healthy young individuals. However, sex and task difficulty appear to be important variables in emotion recognition from facial expressions.
Hargreaves, A; Mothersill, O; Anderson, M; Lawless, S; Corvin, A; Donohoe, G
2016-10-28
Deficits in facial emotion recognition have been associated with functional impairments in patients with schizophrenia (SZ). Whilst a strong ecological argument has been made for the use of both dynamic facial expressions and varied emotion intensities in research, SZ emotion recognition studies to date have primarily used static stimuli at a single (100%) intensity of emotion. To address this issue, the present study investigated accuracy of emotion recognition amongst patients with SZ and healthy subjects using dynamic facial emotion stimuli of varying intensities. To this end, an emotion recognition task (ERT) designed by Montagne (2007) was adapted and employed. Forty-seven patients with a DSM-IV diagnosis of SZ and 51 healthy participants were assessed for emotion recognition. Results of the ERT were tested for correlation with performance in areas of cognitive ability typically found to be impaired in psychosis, including IQ, memory, attention and social cognition. Patients performed less well than healthy participants at recognising each of the 6 emotions analysed. Surprisingly, however, the groups did not differ in terms of the impact of emotion intensity on recognition accuracy: for both groups, higher intensity levels predicted greater accuracy, but no significant interaction between diagnosis and emotional intensity was found for any of the 6 emotions. Accuracy of emotion recognition was, however, more strongly correlated with cognition in the patient cohort. Whilst this study demonstrates the feasibility of using ecologically valid dynamic stimuli in the study of emotion recognition accuracy, varying the intensity of the emotion displayed did not affect patients and healthy participants differentially, and thus may not be a necessary variable to include in emotion recognition research.
Recognition of face identity and emotion in expressive specific language impairment.
Merkenschlager, A; Amorosa, H; Kiefl, H; Martinius, J
2012-01-01
To study face and emotion recognition in children with mostly expressive specific language impairment (SLI-E), a test movie assessing the perception and recognition of faces and of facial and gestural emotional expression was administered to 24 children diagnosed with SLI-E and to an age-matched control group of normally developing children. Compared to the control group, the SLI-E children scored significantly worse on both the face and expression recognition tasks, with a more pronounced deficit in emotion recognition. The performance of the SLI-E group could not be explained by reduced attention during the test session. We conclude that SLI-E is associated with a deficiency in decoding non-verbal emotional facial and gestural information, which might lead to profound and persistent problems in social interaction and development.
Emotion recognition in frontotemporal dementia and Alzheimer's disease: A new film-based assessment.
Goodkind, Madeleine S; Sturm, Virginia E; Ascher, Elizabeth A; Shdo, Suzanne M; Miller, Bruce L; Rankin, Katherine P; Levenson, Robert W
2015-08-01
Deficits in recognizing others' emotions are reported in many psychiatric and neurological disorders, including autism, schizophrenia, behavioral variant frontotemporal dementia (bvFTD) and Alzheimer's disease (AD). Most previous emotion recognition studies have required participants to identify emotional expressions in photographs. This type of assessment differs from real-world emotion recognition in important ways: Images are static rather than dynamic, include only 1 modality of emotional information (i.e., visual information), and are presented absent a social context. Additionally, existing emotion recognition batteries typically include multiple negative emotions, but only 1 positive emotion (i.e., happiness) and no self-conscious emotions (e.g., embarrassment). We present initial results using a new task for assessing emotion recognition that was developed to address these limitations. In this task, respondents view a series of short film clips and are asked to identify the main characters' emotions. The task assesses multiple negative, positive, and self-conscious emotions based on information that is multimodal, dynamic, and socially embedded. We evaluate this approach in a sample of patients with bvFTD, AD, and normal controls. Results indicate that patients with bvFTD have emotion recognition deficits in all 3 categories of emotion compared to the other groups. These deficits were especially pronounced for negative and self-conscious emotions. Emotion recognition in this sample of patients with AD was indistinguishable from controls. These findings underscore the utility of this approach to assessing emotion recognition and suggest that previous findings that recognition of positive emotion was preserved in dementia patients may have resulted from the limited sampling of positive emotion in traditional tests.
Crossmodal and incremental perception of audiovisual cues to emotional speech.
Barkhuysen, Pashiera; Krahmer, Emiel; Swerts, Marc
2010-01-01
In this article we report on two experiments about the perception of audiovisual cues to emotional speech. The article addresses two questions: (1) how do visual cues from a speaker's face to emotion relate to auditory cues, and (2) what is the recognition speed for various facial cues to emotion? Both experiments are based on tests with video clips of emotional utterances collected via a variant of the well-known Velten method. More specifically, we recorded speakers who displayed positive or negative emotions, which were congruent or incongruent with the (emotional) lexical content of the uttered sentence. The first experiment is a perception experiment in which Czech participants, who do not speak Dutch, rated the perceived emotional state of Dutch speakers in a bimodal (audiovisual) or a unimodal (audio-only or vision-only) condition. It was found that incongruent emotional speech leads to significantly more extreme perceived-emotion scores than congruent emotional speech, and that the difference between congruent and incongruent emotional speech is larger for the negative than for the positive conditions. Interestingly, the largest overall differences between congruent and incongruent emotions were found for the audio-only condition, which suggests that posing an incongruent emotion has a particularly strong effect on the spoken realization of emotions. The second experiment uses a gating paradigm to test the recognition speed for various emotional expressions from a speaker's face. Participants were presented with the same clips as in experiment 1, but this time vision-only, shown in successive segments (gates) of increasing duration. Results show that participants are surprisingly accurate in their recognition of the various emotions, already reaching high recognition scores in the first gate (after only 160 ms). Interestingly, the recognition scores rise faster for positive than for negative conditions. Finally, the gating results suggest that incongruent emotions are perceived as more intense than congruent emotions, as the former receive more extreme recognition scores than the latter, already after a short period of exposure.
Recognition of emotions in autism: a formal meta-analysis.
Uljarevic, Mirko; Hamilton, Antonia
2013-07-01
Determining the integrity of emotion recognition in autistic spectrum disorder is important to our theoretical understanding of autism and to teaching social skills. Previous studies have reported both positive and negative results. Here, we take a formal meta-analytic approach, bringing together data from 48 papers testing over 980 participants with autism. Results show that there is an emotion recognition difficulty in autism, with a mean effect size of 0.80, which reduces to 0.41 when a correction for publication bias is applied. Recognition of happiness was only marginally impaired in autism, but recognition of fear was marginally worse than recognition of happiness. This meta-analysis provides an opportunity to survey the state of emotion recognition research in autism and to outline potential future directions.
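The core computation behind a formal meta-analysis of this kind is a precision-weighted pooling of per-study effect sizes. As a minimal sketch, assuming a handful of invented effect sizes and sampling variances (these are not the data from the 48 papers analysed here), a fixed-effect inverse-variance weighted mean can be computed as follows:

```python
# Inverse-variance weighted mean effect size, the core computation of a
# fixed-effect meta-analysis. All numbers below are invented for
# illustration; they are NOT the data from the 48 papers in this study.
ds = [0.9, 0.7, 1.1, 0.5, 0.8]        # per-study effect sizes (Cohen's d)
vs = [0.04, 0.09, 0.06, 0.05, 0.07]   # per-study sampling variances

ws = [1.0 / v for v in vs]            # weight each study by its precision
d_mean = sum(w * d for w, d in zip(ws, ds)) / sum(ws)
se = (1.0 / sum(ws)) ** 0.5           # standard error of the pooled estimate
print(round(d_mean, 2), round(se, 2))  # prints: 0.8 0.11
```

A publication-bias correction such as trim-and-fill then adjusts this pooled estimate by imputing studies presumed missing from one side of the funnel plot, which is how an apparent effect of 0.80 can shrink toward 0.41.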
Holding, Benjamin C; Laukka, Petri; Fischer, Håkan; Bänziger, Tanja; Axelsson, John; Sundelin, Tina
2017-11-01
Insufficient sleep has been associated with impaired recognition of facial emotions. However, previous studies have found inconsistent results, potentially stemming from the type of static picture task used. We therefore examined whether insufficient sleep was associated with decreased emotion recognition ability in two separate studies using a dynamic multimodal task. Study 1 used a cross-sectional design consisting of 291 participants with questionnaire measures assessing sleep duration and self-reported sleep quality for the previous night. Study 2 used an experimental design involving 181 participants where individuals were quasi-randomized into either a sleep-deprivation (N = 90) or a sleep-control (N = 91) condition. All participants from both studies were tested on the same forced-choice multimodal test of emotion recognition to assess the accuracy of emotion categorization. Sleep duration, self-reported sleep quality (study 1), and sleep deprivation (study 2) did not predict overall emotion recognition accuracy or speed. Similarly, the responses to each of the twelve emotions tested showed no evidence of impaired recognition ability, apart from one positive association suggesting that greater self-reported sleep quality could predict more accurate recognition of disgust (study 1). The studies presented here involve considerably larger samples than previous studies and the results support the null hypotheses. Therefore, we suggest that the ability to accurately categorize the emotions of others is not associated with short-term sleep duration or sleep quality and is resilient to acute periods of insufficient sleep.
ERIC Educational Resources Information Center
Wright, Barry; Clarke, Natalie; Jordan, Jo; Young, Andrew W.; Clarke, Paula; Miles, Jeremy; Nation, Kate; Clarke, Leesa; Williams, Christine
2008-01-01
We compared young people with high-functioning autism spectrum disorders (ASDs) with age, sex and IQ matched controls on emotion recognition of faces and pictorial context. Each participant completed two tests of emotion recognition. The first used Ekman series faces. The second used facial expressions in visual context. A control task involved…
Test-retest reliability and task order effects of emotional cognitive tests in healthy subjects.
Adams, Thomas; Pounder, Zoe; Preston, Sally; Hanson, Andy; Gallagher, Peter; Harmer, Catherine J; McAllister-Williams, R Hamish
2016-11-01
Little is known of the retest reliability of emotional cognitive tasks or the impact of using different tasks employing similar emotional stimuli within a battery. We investigated this in healthy subjects. We found improved overall performance in an emotional attentional blink task (EABT) with repeat testing at one hour and one week compared to baseline, but the impact of an emotional stimulus on performance was unchanged. Similarly, performance on a facial expression recognition task (FERT) was better one week after a baseline test, though the relative effect of specific emotions was unaltered. There was no effect of repeat testing on an emotional word categorising, recall and recognition task. We found no difference in performance in the FERT and EABT irrespective of task order. We concluded that it is possible to use emotional cognitive tasks in longitudinal studies and combine tasks using emotional facial stimuli in a single battery.
Basic and complex emotion recognition in children with autism: cross-cultural findings.
Fridenson-Hayo, Shimrit; Berggren, Steve; Lassalle, Amandine; Tal, Shahar; Pigat, Delia; Bölte, Sven; Baron-Cohen, Simon; Golan, Ofer
2016-01-01
Children with autism spectrum conditions (ASC) have emotion recognition deficits when tested in different expression modalities (face, voice, body). However, these findings usually focus on basic emotions, using one or two expression modalities. In addition, cultural similarities and differences in emotion recognition patterns in children with ASC have not been explored before. The current study examined the similarities and differences in the recognition of basic and complex emotions by children with ASC and typically developing (TD) controls across three cultures: Israel, Britain, and Sweden. Fifty-five children with high-functioning ASC, aged 5-9, were compared to 58 TD children. At each site, groups were matched on age, sex, and IQ. Children were tested using four tasks, examining recognition of basic and complex emotions from voice recordings, videos of facial and bodily expressions, and emotional video scenarios including all modalities in context. Compared to their TD peers, children with ASC showed emotion recognition deficits in both basic and complex emotions in all three modalities and in their integration in context. Complex emotions were harder to recognize than basic emotions for the entire sample. Cross-cultural agreement was found for all major findings, with minor deviations on the face and body tasks. Our findings highlight the multimodal nature of emotion recognition deficits in ASC, which exist for basic as well as complex emotions and are relatively stable cross-culturally. Cross-cultural research has the potential to reveal both autism-specific universal deficits and the role that specific cultures play in the way empathy operates in different countries.
ERIC Educational Resources Information Center
Golan, Ofer; Ashwin, Emma; Granader, Yael; McClintock, Suzy; Day, Kate; Leggett, Victoria; Baron-Cohen, Simon
2010-01-01
This study evaluated "The Transporters", an animated series designed to enhance emotion comprehension in children with autism spectrum conditions (ASC). n = 20 children with ASC (aged 4-7) watched "The Transporters" everyday for 4 weeks. Participants were tested before and after intervention on emotional vocabulary and emotion recognition at three…
Castagna, Filomena; Montemagni, Cristiana; Maria Milani, Anna; Rocca, Giuseppe; Rocca, Paola; Casacchia, Massimo; Bogetto, Filippo
2013-02-28
This study aimed to evaluate the ability to decode emotion in the auditory and audiovisual modalities in a group of patients with schizophrenia, and to explore the role of cognition and psychopathology in these emotion recognition abilities. Ninety-four outpatients in a stable phase and 51 healthy subjects were recruited. Patients were assessed through a psychiatric evaluation and a wide neuropsychological battery. All subjects completed the Comprehensive Affect Testing System (CATS), a group of computerized tests designed to evaluate emotion perception abilities. With respect to the controls, patients were not impaired in the CATS tasks involving discrimination of non-emotional prosody, naming of emotional stimuli expressed by voice, and judging the emotional content of a sentence, whereas they showed a specific impairment in decoding emotion in a conflicting auditory condition and in the multichannel modality. The prosody impairment was affected by executive functions, attention and negative symptoms, while the deficit in multisensory emotion recognition was affected by executive functions and negative symptoms. These emotion recognition deficits, rather than reflecting purely perceptual disturbances in schizophrenia, are affected by core symptoms of the illness.
Emotion Recognition Pattern in Adolescent Boys with Attention-Deficit/Hyperactivity Disorder
Bozsik, Csilla; Gadoros, Julia; Inantsy-Pap, Judit
2014-01-01
Background. Social and emotional deficits have recently been considered inherent features of individuals with attention-deficit/hyperactivity disorder (ADHD), but only sporadic literature data exist on emotion recognition in adolescents with ADHD. The aim of the present study was to establish the emotion recognition profile of adolescent boys with ADHD in comparison with control adolescents. Methods. Forty-four adolescent boys (13-16 years) participated in the study after informed consent; 22 boys had a clinical diagnosis of ADHD, while data were also assessed from 22 control boys matched for age and Raven IQ. Parent- and self-reported behavioral characteristics were assessed by means of the Strengths and Difficulties Questionnaire. The recognition of six basic emotions was evaluated using the "Facial Expressions of Emotion-Stimuli and Tests." Results. Compared to controls, adolescents with ADHD were more sensitive in the recognition of disgust, worse in the recognition of fear, and showed a tendency toward impaired recognition of sadness. Hyperactivity measures showed an inverse correlation with fear recognition. Conclusion. Our data suggest that adolescent boys with ADHD have alterations in the recognition of specific emotions.
Remembering the snake in the grass: Threat enhances recognition but not source memory.
Meyer, Miriam Magdalena; Bell, Raoul; Buchner, Axel
2015-12-01
Research on the influence of emotion on source memory has yielded inconsistent findings. The object-based framework (Mather, 2007) predicts that negatively arousing stimuli attract attention, resulting in enhanced within-object binding, and, thereby, enhanced source memory for intrinsic context features of emotional stimuli. To test this prediction, we presented pictures of threatening and harmless animals, the color of which had been experimentally manipulated. In a memory test, old-new recognition for the animals and source memory for their color was assessed. In all 3 experiments, old-new recognition was better for the more threatening material, which supports previous reports of an emotional memory enhancement. This recognition advantage was due to the emotional properties of the stimulus material, and not specific for snake stimuli. However, inconsistent with the prediction of the object-based framework, intrinsic source memory was not affected by emotion.
Chen, Jing; Hu, Bin; Wang, Yue; Moore, Philip; Dai, Yongqiang; Feng, Lei; Ding, Zhijie
2017-12-20
Collaboration between humans and computers has become pervasive and ubiquitous; however, current computer systems are limited in that they fail to address the emotional component. An accurate understanding of human emotions is necessary for these computers to trigger proper feedback. Among multiple emotional channels, physiological signals are synchronous with emotional responses; therefore, analyzing physiological changes is a recognized way to estimate human emotions. In this paper, a three-stage decision method is proposed to recognize four emotions based on physiological signals in the multi-subject context. Emotion detection is achieved by using a stage-divided strategy in which each stage deals with a fine-grained goal. During the training process, the first stage partitions the mixed training subjects into separate groups, thus mitigating the effect of individual differences. The second stage categorizes the four emotions into two emotion pools in order to reduce recognition complexity. The third stage trains a classifier for the emotions in each emotion pool. During the testing process, a test trial is first classified into a group, then into an emotion pool in the second stage, and an emotion is assigned to the trial in the final stage. We consider two different ways of allocating the four emotions into two emotion pools, and a comparative analysis is carried out between the proposed method and other methods. An average recognition accuracy of 77.57% was achieved on the recognition of four emotions, with a best accuracy of 86.67% for recognizing the positive and excited emotion. Using different ways of allocating the four emotions into two emotion pools, we found differences in how effectively a classifier learns each emotion. When compared to other methods, the proposed method demonstrates a significant improvement in recognizing four emotions in the multi-subject context. The proposed three-stage decision method addresses a crucial issue in multi-subject emotion recognition, namely individual differences, and overcomes the suboptimal performance associated with direct classification of multiple emotions. Our study supports the observation that the proposed method represents a promising methodology for recognizing multiple emotions in the multi-subject context.
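The training and testing pipeline described in this abstract can be sketched as follows. This is a minimal illustration on synthetic data: the 2-means subject grouping, the valence-style pool allocation, and the nearest-centroid classifiers are assumptions made for the sketch, not the authors' actual features, grouping method, or classifiers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 8 subjects x 40 trials x 6 physiological
# features, with labels 0..3 for four emotions. A per-subject baseline
# shift mimics the individual differences the method is designed to handle.
n_subj, n_trials, n_feat = 8, 40, 6
X = rng.normal(size=(n_subj * n_trials, n_feat))
y = rng.integers(0, 4, size=n_subj * n_trials)
subject = np.repeat(np.arange(n_subj), n_trials)
X += subject[:, None] * 0.5

# Assumed stage-2 allocation of the four emotions into two pools
# (e.g. by valence): emotions 0 and 1 -> pool 0, emotions 2 and 3 -> pool 1.
pool_of = np.array([0, 0, 1, 1])

# Stage 1: split subjects into two groups with a small 2-means over the
# per-subject mean feature vectors, separating dissimilar subjects.
means = np.vstack([X[subject == s].mean(0) for s in range(n_subj)])
centroids = means[:2].copy()
for _ in range(10):
    dist = np.linalg.norm(means[:, None] - centroids[None], axis=2)
    groups = dist.argmin(1)
    for k in range(2):
        if (groups == k).any():
            centroids[k] = means[groups == k].mean(0)

def centroid_clf(Xs, ys, labels):
    """Train a nearest-centroid classifier: one centroid per class."""
    return {c: Xs[ys == c].mean(0) for c in labels if (ys == c).any()}

def nearest(cents, x):
    """Predict the class whose centroid is closest to x."""
    return min(cents, key=lambda c: np.linalg.norm(x - cents[c]))

# Stages 2 and 3: per subject group, train a pool classifier plus one
# emotion classifier per emotion pool.
models = {}
for g in np.unique(groups):
    m = np.isin(subject, np.where(groups == g)[0])
    Xg, yg = X[m], y[m]
    pool_clf = centroid_clf(Xg, pool_of[yg], [0, 1])
    emo_clfs = {p: centroid_clf(Xg[pool_of[yg] == p], yg[pool_of[yg] == p],
                                range(4)) for p in (0, 1)}
    models[g] = (pool_clf, emo_clfs)

def predict(x, g):
    """Testing process: trial -> group -> emotion pool -> emotion."""
    pool_clf, emo_clfs = models[g]
    return nearest(emo_clfs[nearest(pool_clf, x)], x)

pred = predict(X[0], groups[subject[0]])
```

The sketch only illustrates the control flow of the stage-divided strategy; the 77.57% average accuracy reported in the abstract refers to the authors' real classifiers on real physiological recordings, not to this toy setup.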
False recognition of facial expressions of emotion: causes and implications.
Fernández-Dols, José-Miguel; Carrera, Pilar; Barchard, Kimberly A; Gacitua, Marta
2008-08-01
This article examines the importance of semantic processes in the recognition of emotional expressions through a series of three studies on false recognition. The first study found a high frequency of false recognition of prototypical expressions of emotion when participants viewed slides and video clips of nonprototypical fearful and happy expressions. The second study tested whether semantic processes caused false recognition. The authors found that participants made significantly higher error rates when asked to detect expressions that corresponded to semantic labels than when asked to detect visual stimuli. Finally, given that previous research reported that false memories are less prevalent in younger children, the third study tested whether false recognition of prototypical expressions increased with age. The authors found that 67% of 8- to 9-year-old children reported nonpresent prototypical expressions of fear in a fearful context, but only 40% of 6- to 7-year-old children did so. Taken together, these three studies demonstrate the importance of semantic processes in the detection and categorization of prototypical emotional expressions.
Action and Emotion Recognition from Point Light Displays: An Investigation of Gender Differences
Alaerts, Kaat; Nackaerts, Evelien; Meyns, Pieter; Swinnen, Stephan P.; Wenderoth, Nicole
2011-01-01
Folk psychology advocates the existence of gender differences in socio-cognitive functions such as 'reading' the mental states of others or discerning subtle differences in body language. A female advantage has been demonstrated for emotion recognition from facial expressions, but virtually nothing is known about gender differences in recognizing bodily stimuli or body language. The aim of the present study was to investigate potential gender differences in a series of tasks involving the recognition of distinct features from point light displays (PLDs) depicting bodily movements of a male and a female actor. Although recognition scores were high at the overall group level, female participants were more accurate than males in recognizing the depicted actions from PLDs. Response times were significantly higher for males than for females on PLD recognition tasks involving (i) the general recognition of 'biological' versus 'non-biological' (or 'scrambled') motion, or (ii) the recognition of the 'emotional state' of the PLD figures. No gender differences were revealed for a control test (involving the identification of a color change in one of the dots) or for recognizing the gender of the PLD figure. In addition, previous findings of a female advantage on a facial emotion recognition test (the 'Reading the Mind in the Eyes Test'; Baron-Cohen, 2001) were replicated in this study. Interestingly, a strong correlation was revealed between emotion recognition from bodily PLDs and from facial cues. This relationship indicates that inter-individual or gender-dependent differences in recognizing emotions are relatively generalized across facial and bodily emotion perception. Moreover, the tight correlation between a subject's ability to discern subtle emotional cues from PLDs and their ability to discriminate biological from non-biological motion indicates that differences in emotion recognition may, at least to some degree, be related to more basic differences in processing biological motion per se.
ERIC Educational Resources Information Center
Braun, M.; Traue, H.C.; Frisch, S.; Deighton, R.M.; Kessler, H.
2005-01-01
The aim of this study was to investigate the effect of a stroke event on people's ability to recognize basic emotions. In particular, the hypothesis that right brain-damaged (RBD) patients would show poorer emotion recognition ability than left brain-damaged (LBD) patients and healthy controls was tested. To investigate this, the FEEL…
Goghari, Vina M; Macdonald, Angus W; Sponheim, Scott R
2011-11-01
Temporal lobe abnormalities and emotion recognition deficits are prominent features of schizophrenia and appear related to the diathesis of the disorder. This study investigated whether temporal lobe structural abnormalities were associated with facial emotion recognition deficits in schizophrenia and related to genetic liability for the disorder. Twenty-seven schizophrenia patients, 23 biological family members, and 36 controls participated. Several temporal lobe regions (fusiform, superior temporal, middle temporal, amygdala, and hippocampus) previously associated with face recognition in normative samples and found to be abnormal in schizophrenia were evaluated using volumetric analyses. Participants completed a facial emotion recognition task and an age recognition control task under time-limited and self-paced conditions. Temporal lobe volumes were tested for associations with task performance. Group status explained 23% of the variance in temporal lobe volume. Left fusiform gray matter volume was decreased by 11% in patients and 7% in relatives compared with controls. Schizophrenia patients additionally exhibited smaller hippocampal and middle temporal volumes. Patients were unable to improve facial emotion recognition performance with unlimited time to make a judgment but were able to improve age recognition performance. Patients additionally showed a relationship between reduced temporal lobe gray matter and poor facial emotion recognition. For the middle temporal lobe region, the relationship between greater volume and better task performance was specific to facial emotion recognition and not age recognition. Because schizophrenia patients exhibited a specific deficit in emotion recognition not attributable to a generalized impairment in face perception, impaired emotion recognition may serve as a target for interventions.
Chiu, Isabelle; Gfrörer, Regina I; Piguet, Olivier; Berres, Manfred; Monsch, Andreas U; Sollberger, Marc
2015-08-01
The importance of including measures of emotion processing, such as tests of facial emotion recognition (FER), as part of a comprehensive neuropsychological assessment is increasingly recognized. In clinical settings, FER tests need to be sensitive, short, and easy to administer, given the limited time available and patient limitations. Current tests, however, commonly use stimuli that either display prototypical emotions, bearing the risk of ceiling effects and unequal task difficulty, or are cognitively too demanding and time-consuming. To overcome these limitations in FER testing in patient populations, we aimed to define FER threshold levels for the six basic emotions in healthy individuals. Forty-nine healthy individuals between 52 and 79 years of age were asked to identify the six basic emotions at different intensity levels (25%, 50%, 75%, 100%, and 125% of the prototypical emotion). Analyses uncovered differing threshold levels across emotions and across the sex of the facial stimuli, ranging from 50% to 100% intensity. Using these findings as "healthy population benchmarks", we propose applying these threshold levels to clinical populations, either as facial emotion recognition or as intensity rating tasks. As part of any comprehensive social cognition test battery, this approach should allow a rapid and sensitive assessment of potential FER deficits.
Golan, Ofer; Baron-Cohen, Simon; Golan, Yael
2008-09-01
Children with autism spectrum conditions (ASC) have difficulties recognizing others' emotions. Research has mostly focused on basic emotion recognition, devoid of context. This study reports the results of a new task assessing the recognition of complex emotions and mental states in social contexts. An ASC group (n = 23) was compared to a general population control group (n = 24). Children with ASC scored lower than controls on the task. Using task scores, more than 87% of the participants were correctly allocated to their group. This new test quantifies complex emotion and mental state recognition in life-like situations. Our findings reveal that children with ASC have residual difficulties in this aspect of empathy. The use of language-based compensatory strategies for emotion recognition is discussed.
Fenske, Sabrina; Lis, Stefanie; Liebke, Lisa; Niedtfeld, Inga; Kirsch, Peter; Mier, Daniela
2015-01-01
Borderline Personality Disorder (BPD) is characterized by severe deficits in social interactions, which might be linked to deficits in emotion recognition. Research on emotion recognition abilities in BPD has revealed heterogeneous results, ranging from deficits to heightened sensitivity. The most stable findings point to an impairment in the evaluation of neutral facial expressions as neutral, as well as to a negative bias in emotion recognition; that is, the tendency to attribute negative emotions to neutral expressions, or in a broader sense to report a more negative emotion category than depicted. However, it remains unclear which contextual factors influence the occurrence of this negative bias. Previous studies suggest that priming by preceding emotional information and also constrained processing time might augment the emotion recognition deficit in BPD. To test these assumptions, 32 female BPD patients and 31 healthy females, matched for age and education, participated in an emotion recognition study in which every facial expression was preceded by either a positive, neutral, or negative scene. Furthermore, time constraints on processing were varied by presenting the facial expressions with short (100 ms) or long duration (up to 3000 ms) in two separate blocks. BPD patients showed a significant deficit in emotion recognition for neutral and positive facial expressions, associated with a significant negative bias. In BPD patients, this emotion recognition deficit was differentially affected by preceding emotional information and time constraints, with a greater influence of emotional information during long face presentations and a greater influence of neutral information during short face presentations. Our results are in line with previous findings supporting the existence of a negative bias in emotion recognition in BPD patients, and provide further insights into biased social perception in BPD patients.
Facial emotion recognition is inversely correlated with tremor severity in essential tremor.
Auzou, Nicolas; Foubert-Samier, Alexandra; Dupouy, Sandrine; Meissner, Wassilios G
2014-04-01
Here we assess limbic and orbitofrontal control in 20 patients with essential tremor (ET) and 18 age-matched healthy controls using the Ekman Facial Emotion Recognition Task and the Iowa Gambling Task. Our results show an inverse relation between facial emotion recognition and tremor severity. ET patients also showed worse performance in joy and fear recognition, as well as subtle abnormalities in risk detection, but these differences did not reach significance after correction for multiple testing.
Conduct symptoms and emotion recognition in adolescent boys with externalization problems.
Aspan, Nikoletta; Vida, Peter; Gadoros, Julia; Halasz, Jozsef
2013-01-01
In adults with antisocial personality disorder, marked alterations in the recognition of facial affect have been described. Less consistent data are available on emotion recognition in adolescents with externalization problems. The aim of the present study was to assess the relation between the recognition of emotions and conduct symptoms in adolescent boys with externalization problems. Adolescent boys with externalization problems referred to Vadaskert Child Psychiatry Hospital participated in the study after informed consent (N = 114, 11-17 years, mean age = 13.4 years). The conduct problems scale of the Strengths and Difficulties Questionnaire (parent and self-report) was used, and performance on a facial emotion recognition test was assessed. The conduct problems score (parent and self-report) was inversely correlated with overall emotion recognition. In the self-report, the conduct problems score was inversely correlated with the recognition of anger, fear, and sadness. Adolescents with high conduct problems scores were significantly worse at recognizing fear and sadness, and at overall recognition, than adolescents with low conduct scores, irrespective of age and IQ. Our results suggest that impaired emotion recognition is dimensionally related to conduct problems and might have importance in the development of antisocial behavior.
ERIC Educational Resources Information Center
Cebula, Katie R.; Wishart, Jennifer G.; Willis, Diane S.; Pitcairn, Tom K.
2017-01-01
Some children with Down syndrome may experience difficulties in recognizing facial emotions, particularly fear, but it is not clear why, nor how such skills can best be facilitated. Using a photo-matching task, emotion recognition was tested in children with Down syndrome, children with nonspecific intellectual disability and cognitively matched,…
Halász, József; Áspán, Nikoletta; Bozsik, Csilla; Gádoros, Júlia; Inántsy-Pap, Judit
2013-01-01
In adult individuals with antisocial personality disorder, impairment in the recognition of fear seems established. In adolescents with conduct disorder (an antecedent of antisocial personality disorder), only sporadic data are available, but the literature indicates alterations in the recognition of emotions. The aim of the present study was to assess the relationship between emotion recognition and conduct symptoms in non-clinical adolescents. Fifty-three adolescents participated in the study after informed consent (13-16 years; boys, n=29, age 14.7±0.2 years; girls, n=24, age 14.7±0.2 years). The parent version of the Strengths and Difficulties Questionnaire was used to assess behavioral problems. The recognition of six basic emotions was established with the "Facial Expressions of Emotion: Stimuli and Tests", and Raven IQ measures were also obtained. Compared to boys, girls showed significantly better performance in the recognition of disgust (p<0.035), while no significant difference occurred in the recognition of other emotions. In boys, the Conduct Problems score was inversely correlated with the recognition of fear (Spearman R=-0.40, p<0.031) and with overall emotion recognition (Spearman R=-0.44, p<0.015), while no similar correlation was present in girls. The relationship between the recognition of emotions and conduct problems might indicate an important mechanism in the development of antisocial behavior.
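The inverse associations reported here are Spearman rank correlations. As a hedged illustration, a minimal self-contained implementation shows how such a coefficient is computed; the sample data below are hypothetical, not values from the study:

```python
def rank(xs):
    """Assign average ranks (1-based), sharing ranks across ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied positions
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the rank-transformed data."""
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical scores: higher conduct-problem rank, lower fear recognition
conduct = [1, 4, 2, 5, 3]
fear_recognition = [5, 2, 4, 1, 3]
rho = spearman(conduct, fear_recognition)
print(round(rho, 6))  # -1.0 for perfectly inverse rankings
```

Real analyses would of course use many more participants and a significance test, but the rank transformation is the core of the statistic.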
Mood-congruent false memories persist over time.
Knott, Lauren M; Thorley, Craig
2014-01-01
In this study, we examined the role of mood-congruency and retention interval in the false recognition of emotion-laden items using the Deese/Roediger-McDermott (DRM) paradigm. Previous research has shown a mood-congruent false memory enhancement during immediate recognition tasks. The present study examined the persistence of this effect following a one-week delay. Participants were placed in a negative or neutral mood, presented with negative-emotion and neutral-emotion DRM word lists, and given both immediate and delayed recognition tests. Results showed that a negative mood state increased remember judgments for negative-emotion critical lures, in comparison to neutral-emotion critical lures, on both immediate and delayed testing. These findings are discussed in relation to theories of spreading activation and emotion-enhanced memory, with consideration of the applied forensic implications of such findings.
Oxytocin improves emotion recognition for older males.
Campbell, Anna; Ruffman, Ted; Murray, Janice E; Glue, Paul
2014-10-01
Older adults (≥60 years) perform worse than young adults (18-30 years) when recognizing facial expressions of emotion. One hypothesized cause is a decline in neurotransmitters that affects information processing within the brain. In the present study, we examined the neuropeptide oxytocin, which functions to increase neurotransmission. Research suggests that oxytocin benefits the emotion recognition of less socially able individuals. Men tend to have lower levels of oxytocin, and older men tend to have worse emotion recognition than older women; there is therefore reason to think that older men will be particularly likely to benefit from oxytocin. We examined this idea using a double-blind design, testing 68 older and 68 young adults randomly allocated to receive oxytocin nasal spray (20 international units) or placebo. Forty-five minutes afterward they completed an emotion recognition task assessing labeling accuracy for angry, disgusted, fearful, happy, neutral, and sad faces. Older males receiving oxytocin showed improved emotion recognition relative to those taking placebo. No differences were found for older females or young adults. We hypothesize that oxytocin facilitates emotion recognition by improving neurotransmission in the group with the worst emotion recognition.
Romero, Neri L
2017-06-01
A common social impairment in individuals with ASD is difficulty interpreting and/or predicting the emotions of others. To date, several interventions that teach emotion recognition and understanding have been used by both researchers and practitioners. The results suggest that teaching emotion recognition is possible, but that the gains do not generalize to non-instructional contexts. This study sought to replicate earlier findings of a positive impact of teaching emotion recognition using a computer-based intervention, and to extend them by testing for generalization to live models in the classroom setting. Two boys and one girl, four to eight years of age, educated in self-contained classrooms for students with communication and social skills deficits, participated in this study. A multiple probe across participants design was utilized. Measures of emotion recognition and understanding were assessed at baseline, during intervention, and one month post-intervention to determine maintenance effects. Social validity was assessed through parent and teacher questionnaires. All participants showed improvements on measures assessing their recognition of emotions in faces, generalized this knowledge to live models, and maintained gains one month post-intervention. These preliminary results are encouraging and should inform a group design in order to test efficacy with a larger population.
Drapier, D; Péron, J; Leray, E; Sauleau, P; Biseul, I; Drapier, S; Le Jeune, F; Travers, D; Bourguignon, A; Haegelen, C; Millet, B; Vérin, M
2008-09-01
To test the hypothesis that emotion recognition and apathy share the same functional circuit involving the subthalamic nucleus (STN), a consecutive series of 17 patients with advanced Parkinson's disease (PD) was assessed 3 months before (M-3) and 3 months after (M+3) STN deep brain stimulation (DBS). Mean (±S.D.) age at surgery was 56.9 (8.7) years. Mean disease duration at surgery was 11.8 (2.6) years. Apathy was measured using the Apathy Evaluation Scale (AES) at both M-3 and M+3. Patients were also assessed using a computerised paradigm of facial emotion recognition [Ekman, P., & Friesen, W. V. (1976). Pictures of facial affect. Palo Alto: Consulting Psychologist Press] before and after STN DBS. Prior to this, the Benton Facial Recognition Test was used to check that the ability to perceive faces was intact. Apathy had significantly worsened at M+3 (42.5±8.9, p=0.006) after STN DBS, relative to the preoperative assessment (37.2±5.5). There was also a significant reduction in recognition percentages for facial expressions of fear (43.1%±22.9 vs. 61.6%±21.4, p=0.022) and sadness (52.7%±19.1 vs. 67.6%±22.8, p=0.031) after STN DBS. However, the postoperative worsening of apathy and the emotion recognition impairment were not correlated. Our results confirm that the STN is involved in both the apathy and emotion recognition networks. However, the absence of any correlation between apathy and emotion recognition impairment suggests that the worsening of apathy following surgery cannot be explained by a lack of facial emotion recognition, and that its behavioural and cognitive components should therefore also be taken into consideration.
Emotional facial recognition in proactive and reactive violent offenders.
Philipp-Wiegmann, Florence; Rösler, Michael; Retz-Junginger, Petra; Retz, Wolfgang
2017-10-01
The purpose of this study was to analyse individual differences in the ability to recognize emotional facial expressions in violent offenders characterised as either reactive or proactive in relation to their offending. In accordance with the findings of our previous study, we expected greater impairment of facial recognition in reactive than in proactive violent offenders. To assess the ability to recognize facial expressions, the computer-based Facial Emotional Expression Labeling Test (FEEL) was administered. Group allocation of reactive and proactive violent offenders and assessment of psychopathic traits were performed by an independent forensic expert using rating scales (PROREA, PCL-SV). Compared to proactive violent offenders and controls, the performance of the reactive offender group in emotion recognition was significantly lower, both overall and especially for negative emotions such as anxiety (d = -1.29), sadness (d = -1.54), and disgust (d = -1.11). Furthermore, reactive violent offenders showed a tendency to interpret non-anger emotions as anger. In contrast, proactive violent offenders performed as well as controls. The general and specific deficits in reactive violent offenders are in line with the results of our previous study and correspond to predictions of the Integrated Emotion System (IES, 7) and hostile attribution processes (21). Given the different error patterns in the FEEL test, the theoretical distinction between proactive and reactive aggression can be supported on the basis of emotion recognition, even though aggression itself is always a heterogeneous act rather than a distinct one-dimensional concept.
Silver, Henry; Bilker, Warren B
2015-01-01
Social cognition is commonly assessed by identification of emotions in facial expressions. Presence of colour, a salient feature of stimuli, might influence emotional face perception. We administered 2 tests of facial emotion recognition, the Emotion Recognition Test (ER40) using colour pictures and the Penn Emotional Acuity Test using monochromatic pictures, to 37 young healthy, 39 old healthy and 37 schizophrenic men. Among young healthy individuals recognition of emotions was more accurate and faster in colour than in monochromatic pictures. Compared to the younger group, older healthy individuals revealed impairment in identification of sad expressions in colour but not monochromatic pictures. Schizophrenia patients showed greater impairment in colour than monochromatic pictures of neutral and sad expressions and overall total score compared to both healthy groups. Patients showed significant correlations between cognitive impairment and perception of emotion in colour but not monochromatic pictures. Colour enhances perception of general emotional clues and this contextual effect is impaired in healthy ageing and schizophrenia. The effects of colour need to be considered in interpreting and comparing studies of emotion perception. Coloured face stimuli may be more sensitive to emotion processing impairments but less selective for emotion-specific information than monochromatic stimuli. This may impact on their utility in early detection of impairments and investigations of underlying mechanisms.
Emotional memory for musical excerpts in young and older adults
Alonso, Irene; Dellacherie, Delphine; Samson, Séverine
2015-01-01
The emotions evoked by music can enhance recognition of excerpts. It has been suggested that memory is better for high than for low arousing music (Eschrich et al., 2005; Samson et al., 2009), but it remains unclear whether positively (Eschrich et al., 2008) or negatively valenced music (Aubé et al., 2013; Vieillard and Gilet, 2013) may be better recognized. Moreover, we still know very little about the influence of age on emotional memory for music. To address these issues, we tested emotional memory for music in young and older adults using musical excerpts varying in arousal and valence. Participants completed immediate and 24 h delayed recognition tests. We predicted that highly arousing excerpts would be better recognized by both groups in immediate recognition. We hypothesized that arousal may compensate for consolidation deficits in aging, thus showing a more prominent benefit of high over low arousing stimuli in older than in younger adults on delayed recognition. We also hypothesized poorer retention of negative excerpts in the older group, resulting in a recognition benefit for positive over negative excerpts specific to older adults. Our results suggest that although older adults had worse recognition than young adults overall, the effects of emotion on memory do not seem to be modified by aging. Results on immediate recognition suggest that recognition of low arousing excerpts can be affected by valence, with better memory for positive relative to negative low arousing music. However, 24 h delayed recognition results demonstrate effects of emotion on memory consolidation regardless of age, with a recognition benefit for high arousal and for negatively valenced music. The present study highlights the role of emotion in memory consolidation. Findings are examined in light of the literature on emotional memory for music and for other stimuli. We finally discuss the implications of the present results for potential music interventions in aging and dementia. PMID:25814950
Specific Impairments in the Recognition of Emotional Facial Expressions in Parkinson’s Disease
Clark, Uraina S.; Neargarder, Sandy; Cronin-Golomb, Alice
2008-01-01
Studies investigating the ability to recognize emotional facial expressions in non-demented individuals with Parkinson’s disease (PD) have yielded equivocal findings. A possible reason for this variability may lie in the confounding of emotion recognition with cognitive task requirements, a confound arising from the lack of a control condition using non-emotional stimuli. The present study examined emotional facial expression recognition abilities in 20 non-demented patients with PD and 23 control participants relative to their performances on a non-emotional landscape categorization test with comparable task requirements. We found that PD participants were normal on the control task but exhibited selective impairments in the recognition of facial emotion, specifically for anger (driven by those with right hemisphere pathology) and surprise (driven by those with left hemisphere pathology), even when controlling for depression level. Male but not female PD participants further displayed specific deficits in the recognition of fearful expressions. We suggest that the neural substrates that may subserve these impairments include the ventral striatum, amygdala, and prefrontal cortices. Finally, we observed that in PD participants, deficiencies in facial emotion recognition correlated with higher levels of interpersonal distress, which calls attention to the significant psychosocial impact that facial emotion recognition impairments may have on individuals with PD. PMID:18485422
Impact of severity of drug use on discrete emotions recognition in polysubstance abusers.
Fernández-Serrano, María José; Lozano, Oscar; Pérez-García, Miguel; Verdejo-García, Antonio
2010-06-01
Neuropsychological studies support the association between severity of drug intake and alterations in specific cognitive domains and neural systems, but there is disproportionately less research on the neuropsychology of emotional alterations associated with addiction. One of the key aspects of adaptive emotional functioning potentially relevant to addiction progression and treatment is the ability to recognize basic emotions in the faces of others. Therefore, the aims of this study were: (i) to examine facial emotion recognition in abstinent polysubstance abusers, and (ii) to explore the association between patterns of quantity and duration of use of several drugs co-abused (including alcohol, cannabis, cocaine, heroin and MDMA) and the ability to identify discrete facial emotional expressions portraying basic emotions. We compared accuracy of emotion recognition of facial expressions portraying six basic emotions (measured with the Ekman Faces Test) between polysubstance abusers (PSA, n=65) and non-drug using comparison individuals (NDCI, n=30), and used regression models to explore the association between quantity and duration of use of the different drugs co-abused and indices of recognition of each of the six emotions, while controlling for relevant socio-demographic and affect-related confounders. Results showed: (i) that PSA had significantly poorer recognition than NDCI for facial expressions of anger, disgust, fear and sadness; (ii) that measures of quantity and duration of drugs used significantly predicted poorer discrete emotions recognition: quantity of cocaine use predicted poorer anger recognition, and duration of cocaine use predicted both poorer anger and fear recognition. Severity of cocaine use also significantly predicted overall recognition accuracy.
NASA Astrophysics Data System (ADS)
Anagnostopoulos, Christos Nikolaos; Vovoli, Eftichia
An emotion recognition framework based on sound processing could improve services in human-computer interaction. Various quantitative speech features obtained from sound processing of acted speech were tested as to whether they are sufficient to discriminate between seven emotions. Multilayer perceptrons were trained to classify gender and emotions on the basis of a 24-input vector, which provides information about the prosody of the speaker over the entire sentence using statistics of sound features. Several experiments were performed and the results are presented analytically. Emotion recognition was successful when speakers and utterances were “known” to the classifier. However, severe misclassifications occurred in the utterance-independent framework. Nevertheless, the proposed feature vector achieved promising results for utterance-independent recognition of high- and low-arousal emotions.
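A sentence-level prosody vector of the kind described can be sketched as summary statistics pooled over frame-level pitch and energy tracks. The particular statistics and frame values below are illustrative assumptions, since the abstract does not list the actual 24 features:

```python
import statistics

def prosody_features(pitch, energy):
    """Sentence-level statistics of frame-level prosody tracks.
    2 tracks x 2 series (raw, frame-to-frame delta) x 6 stats = 24 features."""
    feats = []
    for track in (pitch, energy):
        deltas = [b - a for a, b in zip(track, track[1:])]  # frame-to-frame change
        for series in (track, deltas):
            feats += [
                statistics.mean(series),
                statistics.stdev(series),
                min(series),
                max(series),
                statistics.median(series),
                max(series) - min(series),  # range
            ]
    return feats

# Hypothetical frame-level tracks for one utterance
pitch = [110.0, 115.0, 130.0, 128.0, 120.0, 118.0]   # F0 in Hz
energy = [0.2, 0.5, 0.9, 0.8, 0.6, 0.3]              # normalized intensity
vec = prosody_features(pitch, energy)
print(len(vec))  # 24
```

A multilayer perceptron would then take such a 24-dimensional vector as its input layer, matching the setup the abstract describes.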
Emotion recognition in Parkinson's disease: Static and dynamic factors.
Wasser, Cory I; Evans, Felicity; Kempnich, Clare; Glikmann-Johnston, Yifat; Andrews, Sophie C; Thyagarajan, Dominic; Stout, Julie C
2018-02-01
The authors tested the hypothesis that Parkinson's disease (PD) participants would perform better on an emotion recognition task with dynamic (video) stimuli than on a task using only static (photograph) stimuli, and compared performance on both tasks with that of healthy control participants. In a within-subjects study, 21 PD participants and 20 age-matched healthy controls performed both static and dynamic emotion recognition tasks. The authors used a two-way analysis of variance (controlling for individual participant variance) to determine the effect of group (PD, control) on emotion recognition performance in the static and dynamic facial recognition tasks. Groups did not differ significantly in their performance on the static and dynamic tasks; however, the trend suggested that PD participants performed worse than controls. PD participants may have subtle emotion recognition deficits that are not ameliorated by the addition of contextual cues similar to those found in everyday scenarios. Consistent with previous literature, the results suggest that PD participants may have underlying emotion recognition deficits, which may impact their social functioning.
Gender interactions in the recognition of emotions and conduct symptoms in adolescents.
Halász, József; Aspán, Nikoletta; Bozsik, Csilla; Gádoros, Júlia; Inántsy-Pap, Judit
2014-01-01
According to the literature, impairment in the recognition of emotions might be related to an antisocial developmental pathway. In the present study, gender-specific interactions between emotion recognition and conduct symptoms were studied in non-clinical adolescents. After informed consent, 29 boys and 24 girls (13-16 years, 14 ± 0.1 years) participated in the study. The parent version of the Strengths and Difficulties Questionnaire was used to assess behavioral problems. The recognition of basic emotions was analyzed according to both the gender of the participants and the gender of the stimulus faces via the "Facial Expressions of Emotion: Stimuli and Tests". Girls were significantly better than boys at recognizing disgust, irrespective of the gender of the stimulus faces, although both genders recognized disgust significantly better in male stimulus faces than in female stimulus faces. Both boys and girls recognized sadness significantly better in female stimulus faces than in male stimulus faces. There was no gender effect (of either participant or stimulus face) on the recognition of other emotions. Conduct scores in boys were inversely correlated with the recognition of fear in male stimulus faces (R=-0.439, p<0.05) and with overall emotion recognition in male stimulus faces (R=-0.558, p<0.01). In girls, conduct scores showed a trend toward a positive correlation with disgust recognition in female stimulus faces (R=0.376, p<0.07). A gender-specific interaction between the recognition of emotions and the antisocial developmental pathway is suggested.
Test battery for measuring the perception and recognition of facial expressions of emotion
Wilhelm, Oliver; Hildebrandt, Andrea; Manske, Karsten; Schacht, Annekathrin; Sommer, Werner
2014-01-01
Despite the importance of perceiving and recognizing facial expressions in everyday life, there is no comprehensive test battery for the multivariate assessment of these abilities. As a first step toward such a compilation, we present 16 tasks that measure the perception and recognition of facial emotion expressions, and data illustrating each task's difficulty and reliability. The scoring of these tasks focuses on either the speed or accuracy of performance. A sample of 269 healthy young adults completed all tasks. In general, accuracy and reaction time measures for emotion-general scores showed acceptable and high estimates of internal consistency and factor reliability. Emotion-specific scores yielded lower reliabilities, yet high enough to encourage further studies with such measures. Analyses of task difficulty revealed that all tasks are suitable for measuring emotion perception and emotion recognition related abilities in normal populations. PMID:24860528
Rohr, Michaela; Tröger, Johannes; Michely, Nils; Uhde, Alarith; Wentura, Dirk
2017-07-01
This article deals with two well-documented phenomena regarding emotional stimuli: emotional memory enhancement (better long-term memory for emotional than for neutral stimuli) and the emotion-induced recognition bias (a more liberal response criterion for emotional than for neutral stimuli). Studies on visual emotion perception and attention suggest that emotion-related processes can be modulated by means of spatial-frequency filtering of the presented emotional stimuli. Specifically, low spatial frequencies are assumed to play a primary role for the influence of emotion on attention and judgment. Given this theoretical background, we investigated whether spatial-frequency filtering also impacts (1) the memory advantage for emotional faces and (2) the emotion-induced recognition bias, in a series of old/new recognition experiments. Participants completed incidental-learning tasks with high- (HSF) and low- (LSF) spatial-frequency-filtered emotional and neutral faces. The results of the surprise recognition tests showed a clear memory advantage for emotional stimuli. Most importantly, the emotional memory enhancement was significantly larger for face images containing only low-frequency information (LSF faces) than for HSF faces across all experiments, suggesting that LSF information plays a critical role in this effect, whereas the emotion-induced recognition bias was found only for HSF stimuli. We discuss our findings in terms of both the traditional account of different processing pathways for HSF and LSF information and a stimulus features account. The double dissociation in the results favors the latter account, that is, an explanation in terms of differences in the characteristics of HSF and LSF stimuli.
Baran Tatar, Zeynep; Yargıç, İlhan; Oflaz, Serap; Büyükgök, Deniz
2015-01-01
Interpersonal relationship difficulties in adults with Attention Deficit Hyperactivity Disorder (ADHD) can be associated with impaired non-verbal communication. The purpose of our study was to compare the emotion recognition, facial recognition, and neuropsychological assessments of adult ADHD patients with those of healthy controls, and thus to determine the effect of neuropsychological data on the recognition of emotional expressions. This case-control study was conducted with patients diagnosed with ADHD according to the DSM-IV-TR and followed at the adult ADHD clinic of the Psychiatry Department of the Istanbul University Istanbul Medical Faculty Hospital. The study group consisted of 40 adults (27.5% female) between the ages of 20 and 65 (mean age 25.96 ± 6.07 years; education level 15.02 ± 2.34 years) diagnosed with ADHD, and 40 controls matched with the study group for age, gender, and education level. In the ADHD group, 14 (35%) of the patients had comorbid conditions. Pictures of Facial Affect, the Benton Face Recognition Test, and the Continuous Performance Test were used to evaluate emotion recognition, facial recognition, and attention deficit and impulsivity, respectively. Compared to the control group, the ADHD group made more mistakes in recognizing all types of emotional expressions as well as neutral expressions, and also made more cognitive errors. Facial recognition was similar in both groups. Impulsivity had a significant effect on facial recognition. The social relationship difficulties observed in ADHD may be affected by emotion recognition processes. Future studies could investigate the effects of early psychopharmacological and psychotherapeutic interventions for the main symptoms of ADHD on impaired emotion recognition.
Gender differences in the relationship between social communication and emotion recognition.
Kothari, Radha; Skuse, David; Wakefield, Justin; Micali, Nadia
2013-11-01
To investigate the association between autistic traits and emotion recognition in a large community sample of children using facial and social motion cues, additionally stratifying by gender. A general population sample of 3,666 children from the Avon Longitudinal Study of Parents and Children (ALSPAC) were assessed on their ability to correctly recognize emotions using the faces subtest of the Diagnostic Analysis of Non-Verbal Accuracy, and the Emotional Triangles Task, a novel test assessing recognition of emotion from social motion cues. Children with autistic-like social communication difficulties, as assessed by the Social Communication Disorders Checklist, were compared with children without such difficulties. Autistic-like social communication difficulties were associated with poorer recognition of emotion from social motion cues in both genders, but were associated with poorer facial emotion recognition in boys only (odds ratio = 1.9, 95% CI = 1.4, 2.6, p = .0001). This finding must be considered in light of lower power to detect differences in girls. In this community sample of children, greater deficits in social communication skills are associated with poorer discrimination of emotions, implying there may be an underlying continuum of liability to the association between these characteristics. As a similar degree of association was observed in both genders on a novel test of social motion cues, the relatively good performance of girls on the more familiar task of facial emotion discrimination may be due to compensatory mechanisms. Our study might indicate the existence of a cognitive process by which girls with underlying autistic traits can compensate for their covert deficits in emotion recognition, although this would require further investigation.
Abnormal Facial Emotion Recognition in Depression: Serial Testing in an Ultra-Rapid-Cycling Patient.
ERIC Educational Resources Information Center
George, Mark S.; Huggins, Teresa; McDermut, Wilson; Parekh, Priti I.; Rubinow, David; Post, Robert M.
1998-01-01
Mood disorder subjects have a selective deficit in recognizing human facial emotion. Whether the facial emotion recognition errors persist during normal mood states (i.e., are state vs. trait dependent) was studied in one male bipolar II patient. Results of five sessions are presented and discussed. (Author/EMK)
Liberal Bias Mediates Emotion Recognition Deficits in Frontal Traumatic Brain Injury
ERIC Educational Resources Information Center
Callahan, Brandy L.; Ueda, Keita; Sakata, Daisuke; Plamondon, Andre; Murai, Toshiya
2011-01-01
It is well-known that patients having sustained frontal-lobe traumatic brain injury (TBI) are severely impaired on tests of emotion recognition. Indeed, these patients have significant difficulty recognizing facial expressions of emotion, and such deficits are often associated with decreased social functioning and poor quality of life. As of yet,…
Recognizing Emotions: Testing an Intervention for Children with Autism Spectrum Disorders
ERIC Educational Resources Information Center
Richard, Donna Abely; More, William; Joy, Stephen P.
2015-01-01
A severely impaired capacity for social interaction is one of the characteristics of individuals with autism spectrum disorder (ASD). Deficits in facial emotional recognition processing may be associated with this limitation. The Build-a-Face (BAF) art therapy intervention was developed to assist with emotional recognition through the viewing and…
A Diffusion Model Analysis of Decision Biases Affecting Delayed Recognition of Emotional Stimuli.
Bowen, Holly J; Spaniol, Julia; Patel, Ronak; Voss, Andreas
2016-01-01
Previous empirical work suggests that emotion can influence accuracy and cognitive biases underlying recognition memory, depending on the experimental conditions. The current study examines the effects of arousal and valence on delayed recognition memory using the diffusion model, which allows the separation of two decision biases thought to underlie memory: response bias and memory bias. Memory bias has not been given much attention in the literature but can provide insight into the retrieval dynamics of emotion-modulated memory. Participants viewed emotional pictorial stimuli; half were given a recognition test 1 day later and the other half 7 days later. Analyses revealed that emotional valence generally evokes liberal responding, whereas high arousal evokes liberal responding only at a short retention interval. The memory bias analyses indicated that participants experienced greater familiarity with high-arousal than with low-arousal items, and this pattern became more pronounced as study-test lag increased; positive items evoked greater familiarity than negative items, and this pattern remained stable across retention intervals. The findings provide insight into the separate contributions of valence and arousal to the cognitive mechanisms underlying delayed emotion-modulated memory.
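The distinction the diffusion model draws above, response bias versus memory bias, can be illustrated with a toy random-walk simulation: shifting the starting point toward the "old" boundary models a liberal response bias, while increasing the drift rate models stronger familiarity. This is a minimal sketch under simplified assumptions (fixed boundaries, Gaussian noise, Euler steps), not the parameter-estimation procedure used in the study.

```python
import math, random

def sim_diffusion(drift, start, a=1.0, sigma=1.0, dt=0.005, n=5000, seed=1):
    """Simulate n diffusion trials; return the proportion of 'old' responses.
    Evidence starts at start*a and accumulates until it hits the upper
    boundary a ('old') or the lower boundary 0 ('new')."""
    rng = random.Random(seed)
    old = 0
    for _ in range(n):
        x = start * a
        while 0.0 < x < a:
            x += drift * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        old += x >= a
    return old / n

# Response bias: a starting point nearer the 'old' boundary raises 'old'
# responses even though the evidence (drift) is identical.
neutral = sim_diffusion(drift=0.5, start=0.5)
liberal = sim_diffusion(drift=0.5, start=0.65)

# Memory bias: stronger familiarity (larger drift toward 'old') raises
# 'old' responses without any shift in the starting point.
strong = sim_diffusion(drift=1.5, start=0.5)
```

Both manipulations raise the "old" response rate, which is why the two biases must be separated by model fitting rather than read directly off hit and false-alarm rates.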
Rosenberg, Hannah; McDonald, Skye; Dethier, Marie; Kessels, Roy P C; Westbrook, R Frederick
2014-11-01
Many individuals who sustain moderate to severe traumatic brain injuries (TBI) are poor at recognizing emotional expressions, with a greater impairment in recognizing negative (e.g., fear, disgust, sadness, and anger) than positive emotions (e.g., happiness and surprise). It has been questioned whether this "valence effect" might be an artifact of the wide use of static facial emotion stimuli (usually full-blown expressions) that differ in difficulty, rather than a real consequence of brain impairment. This study aimed to investigate the valence effect in TBI while examining emotion recognition across different intensities (low, medium, and high). Twenty-seven individuals with TBI and 28 matched control participants were tested on the Emotion Recognition Task (ERT). The TBI group was more impaired in overall emotion recognition and less accurate at recognizing negative emotions. However, examining performance across the different intensities indicated that this difference was driven by some emotions (e.g., happiness) being much easier to recognize than others (e.g., fear and surprise). Our findings indicate that individuals with TBI have an overall deficit in facial emotion recognition and that both people with TBI and control participants found some emotions more difficult than others. These results suggest that conventional measures of facial affect recognition that do not examine variance in the difficulty of emotions may produce erroneous conclusions about differential impairment. They also cast doubt on the notion that dissociable neural pathways underlie the recognition of positive and negative emotions, which are differentially affected by TBI and potentially other neurological or psychiatric disorders.
Brain correlates of musical and facial emotion recognition: evidence from the dementias.
Hsieh, S; Hornberger, M; Piguet, O; Hodges, J R
2012-07-01
The recognition of facial expressions of emotion is impaired in semantic dementia (SD) and is associated with right-sided brain atrophy in areas known to be involved in emotion processing, notably the amygdala. Whether patients with SD also experience difficulty recognizing emotions conveyed by other media, such as music, is unclear. Prior studies have used excerpts of known music from classical or film repertoire but not unfamiliar melodies designed to convey distinct emotions. Patients with SD (n = 11), Alzheimer's disease (n = 12) and healthy control participants (n = 20) underwent tests of emotion recognition in two modalities: unfamiliar musical tunes and unknown faces as well as volumetric MRI. Patients with SD were most impaired with the recognition of facial and musical emotions, particularly for negative emotions. Voxel-based morphometry showed that the labelling of emotions, regardless of modality, correlated with the degree of atrophy in the right temporal pole, amygdala and insula. The recognition of musical (but not facial) emotions was also associated with atrophy of the left anterior and inferior temporal lobe, which overlapped with regions correlating with standardized measures of verbal semantic memory. These findings highlight the common neural substrates supporting the processing of emotions by facial and musical stimuli but also indicate that the recognition of emotions from music draws upon brain regions that are associated with semantics in language. Copyright © 2012 Elsevier Ltd. All rights reserved.
Ruocco, Anthony C.; Reilly, James L.; Rubin, Leah H.; Daros, Alex R.; Gershon, Elliot S.; Tamminga, Carol A.; Pearlson, Godfrey D.; Hill, S. Kristian; Keshavan, Matcheri S.; Gur, Ruben C.; Sweeney, John A.
2014-01-01
Background: Difficulty recognizing facial emotions is an important social-cognitive deficit associated with psychotic disorders. It also may reflect a familial risk for psychosis in schizophrenia-spectrum disorders and bipolar disorder. Objective: The objectives of this study from the Bipolar-Schizophrenia Network on Intermediate Phenotypes (B-SNIP) consortium were to: 1) compare emotion recognition deficits in schizophrenia, schizoaffective disorder and bipolar disorder with psychosis, 2) determine the familiality of emotion recognition deficits across these disorders, and 3) evaluate emotion recognition deficits in nonpsychotic relatives with and without elevated Cluster A and Cluster B personality disorder traits. Method: Participants included probands with schizophrenia (n=297), schizoaffective disorder (depressed type, n=61; bipolar type, n=69), bipolar disorder with psychosis (n=248), their first-degree relatives (n=332, n=69, n=154, and n=286, respectively) and healthy controls (n=380). All participants completed the Penn Emotion Recognition Test, a standardized measure of facial emotion recognition assessing four basic emotions (happiness, sadness, anger and fear) and neutral expressions (no emotion). Results: Compared to controls, emotion recognition deficits among probands increased progressively from bipolar disorder to schizoaffective disorder to schizophrenia. Proband and relative groups showed similar deficits perceiving angry and neutral faces, whereas deficits on fearful, happy and sad faces were primarily isolated to schizophrenia probands. Even non-psychotic relatives without elevated Cluster A or Cluster B personality disorder traits showed deficits on neutral and angry faces. Emotion recognition ability was moderately familial only in schizophrenia families. Conclusions: Emotion recognition deficits are prominent but somewhat different across psychotic disorders.
These deficits are reflected to a lesser extent in relatives, particularly on angry and neutral faces. Deficits were evident in non-psychotic relatives even without elevated personality disorder traits. Deficits in facial emotion recognition may reflect an important social-cognitive deficit in patients with psychotic disorders. PMID:25052782
Theory of mind and recognition of facial emotion in dementia: challenge to current concepts.
Freedman, Morris; Binns, Malcolm A; Black, Sandra E; Murphy, Cara; Stuss, Donald T
2013-01-01
Current literature suggests that theory of mind (ToM) and recognition of facial emotion are impaired in behavioral variant frontotemporal dementia (bvFTD). In contrast, studies suggest that ToM is spared in Alzheimer disease (AD). However, there is controversy whether recognition of emotion in faces is impaired in AD. This study challenges the concepts that ToM is preserved in AD and that recognition of facial emotion is impaired in bvFTD. ToM, recognition of facial emotion, and identification of emotions associated with video vignettes were studied in bvFTD, AD, and normal controls. ToM was assessed using false-belief and visual perspective-taking tasks. Identification of facial emotion was tested using Ekman and Friesen's pictures of facial affect. After adjusting for relevant covariates, there were significant ToM deficits in bvFTD and AD compared with controls, whereas neither group was impaired in the identification of emotions associated with video vignettes. There was borderline impairment in recognizing angry faces in bvFTD. Patients with AD showed significant deficits on false belief and visual perspective taking, and bvFTD patients were impaired on second-order false belief. We report novel findings challenging the concepts that ToM is spared in AD and that recognition of facial emotion is impaired in bvFTD.
Kadak, Muhammed Tayyib; Demirel, Omer Faruk; Yavuz, Mesut; Demir, Türkay
2014-07-01
Research findings are mixed regarding the features of the broad autism phenotype (BAP). In this study, we tested whether parents of children with autism have problems recognizing emotional facial expressions, and the contribution of such an impairment to the broad phenotype of autism. Seventy-two parents of children with autism spectrum disorder (ASD) and 38 parents of control children participated in the study. Broad autism features were measured with the Autism Quotient (AQ). Recognition of emotional facial expressions was assessed with the Emotion Recognition Test, consisting of a set of photographs from Ekman and Friesen's series. In a two-tailed analysis of variance of AQ, there was a significant difference for social skills (F(1, 106)=6.095; p<.05). Analyses of variance revealed significant differences in the recognition of happy, surprised and neutral expressions (F(1, 106)=4.068, p=.046; F(1, 106)=4.068, p=.046; F(1, 106)=6.064, p=.016). According to our findings, social impairment could be considered a characteristic feature of the BAP. ASD parents had difficulty recognizing neutral expressions, suggesting that they may have impaired recognition of ambiguous expressions, as do autistic children. Copyright © 2014 Elsevier Inc. All rights reserved.
Emotion Recognition in Children and Adolescents with Autism Spectrum Disorders
ERIC Educational Resources Information Center
Kuusikko, Sanna; Haapsamo, Helena; Jansson-Verkasalo, Eira; Hurtig, Tuula; Mattila, Marja-Leena; Ebeling, Hanna; Jussila, Katja; Bolte, Sven; Moilanen, Irma
2009-01-01
We examined upper facial basic emotion recognition in 57 subjects with autism spectrum disorders (ASD) (M = 13.5 years) and 33 typically developing controls (M = 14.3 years) by using a standardized computer-aided measure (The Frankfurt Test and Training of Facial Affect Recognition, FEFA). The ASD group scored lower than controls on the total…
Differential effects of MDMA and methylphenidate on social cognition.
Schmid, Yasmin; Hysek, Cédric M; Simmler, Linda D; Crockett, Molly J; Quednow, Boris B; Liechti, Matthias E
2014-09-01
Social cognition is important in everyday social interactions. The social cognitive effects of 3,4-methylenedioxymethamphetamine (MDMA, 'ecstasy') and methylphenidate (both used for neuroenhancement and as party drugs) are largely unknown. We investigated the acute effects of MDMA (75 mg), methylphenidate (40 mg) and placebo using the Facial Emotion Recognition Task, Multifaceted Empathy Test (MET), Movie for the Assessment of Social Cognition, Social Value Orientation Test and the Moral Judgment Task in a cross-over study in 30 healthy subjects. Additionally, subjective, autonomic, pharmacokinetic, endocrine and adverse drug effects were measured. MDMA enhanced emotional empathy for positive emotionally charged situations in the MET and tended to reduce the recognition of sad faces in the Facial Emotion Recognition Task. MDMA had no effects on cognitive empathy in the MET or social cognitive inferences in the Movie for the Assessment of Social Cognition. MDMA produced subjective 'empathogenic' effects, such as drug liking, closeness to others, openness and trust. In contrast, methylphenidate lacked such subjective effects and did not alter emotional processing, empathy or mental perspective-taking. MDMA, but not methylphenidate, increased plasma levels of oxytocin and prolactin. Neither drug influenced moral judgment. Effects on emotion recognition and emotional empathy were evident at a low dose of MDMA and likely contribute to the popularity of the drug. © The Author(s) 2014.
An Investigation of Emotion Recognition and Theory of Mind in People with Chronic Heart Failure
Habota, Tina; McLennan, Skye N.; Cameron, Jan; Ski, Chantal F.; Thompson, David R.; Rendell, Peter G.
2015-01-01
Objectives: Cognitive deficits are common in patients with chronic heart failure (CHF), but no study has investigated whether these deficits extend to social cognition. The present study provided the first empirical assessment of emotion recognition and theory of mind (ToM) in patients with CHF. In addition, it assessed whether each of these social cognitive constructs was associated with more general cognitive impairment. Methods: A group comparison design was used, with 31 CHF patients compared to 38 demographically matched controls. The Ekman Faces test was used to assess emotion recognition, and the Mind in the Eyes test to measure ToM. Measures assessing global cognition, executive functions, and verbal memory were also administered. Results: There were no differences between groups on emotion recognition or ToM. The CHF group's performance was poorer on some executive measures, but memory was relatively preserved. In the CHF group, both emotion recognition performance and ToM ability correlated moderately with global cognition (r = .38, p = .034; r = .49, p = .005, respectively), but not with executive function or verbal memory. Conclusion: CHF patients with lower cognitive ability were more likely to have difficulty recognizing emotions and inferring the mental states of others. Clinical implications of these findings are discussed. PMID:26529409
Falkmer, Marita; Black, Melissa; Tang, Julia; Fitzgerald, Patrick; Girdler, Sonya; Leung, Denise; Ordqvist, Anna; Tan, Tele; Jahan, Ishrat; Falkmer, Torbjorn
2016-01-01
Local bias in visual processing in children with autism spectrum disorders (ASD) has been reported to result in difficulties in recognizing faces and facially expressed emotions, but also in superior ability in disembedding figures; however, associations between these abilities within a group of children with and without ASD have not been explored. Possible associations in performance on the Visual Perception Skills Figure-Ground test, a face recognition test and an emotion recognition test were investigated in 25 children aged 8-12 years with high-functioning autism/Asperger syndrome, and in comparison to 33 typically developing children. Analyses indicated a weak positive correlation between accuracy in figure-ground recognition and emotion recognition. No other correlation estimates were significant. These findings challenge both the enhanced perceptual function hypothesis and the weak central coherence hypothesis, and accentuate the importance of further scrutinizing the existence and nature of local visual bias in ASD.
Impaired Perception of Emotional Expression in Amyotrophic Lateral Sclerosis.
Oh, Seong Il; Oh, Ki Wook; Kim, Hee Jin; Park, Jin Seok; Kim, Seung Hyun
2016-07-01
The increasing recognition that deficits in social emotions occur in amyotrophic lateral sclerosis (ALS) is helping to explain the spectrum of neuropsychological dysfunctions, thus supporting the view of ALS as a multisystem disorder involving neuropsychological deficits as well as motor deficits. The aim of this study was to characterize the emotion perception abilities of Korean patients with ALS based on the recognition of facial expressions. Twenty-four patients with ALS and 24 age- and sex-matched healthy controls completed neuropsychological tests and facial emotion recognition tasks [ChaeLee Korean Facial Expressions of Emotions (ChaeLee-E)]. The ChaeLee-E test includes facial expressions for seven emotions: happiness, sadness, anger, disgust, fear, surprise, and neutral. The ability to perceive facial emotions was significantly worse among ALS patients than among healthy controls [65.2±18.0% vs. 77.1±6.6% (mean±SD), p=0.009]. Eight of the 24 patients (33%) scored below the 5th percentile of controls for recognizing facial emotions. Emotion perception deficits occur in Korean ALS patients, particularly regarding facial expressions of emotion. These findings expand the spectrum of cognitive and behavioral dysfunction associated with ALS to include emotion processing dysfunction.
Limbrecht-Ecklundt, Kerstin; Scheck, Andreas; Jerg-Bretzke, Lucia; Walter, Steffen; Hoffmann, Holger; Traue, Harald C.
2013-01-01
Objective: This article includes the examination of potential methodological problems of the application of a forced choice response format in facial emotion recognition. Methodology: 33 subjects were presented with validated facial stimuli. The task was to make a decision about which emotion was shown. In addition, the subjective certainty concerning the decision was recorded. Results: The detection rates are 68% for fear, 81% for sadness, 85% for anger, 87% for surprise, 88% for disgust, and 94% for happiness, and are thus well above the random probability. Conclusion: This study refutes the concern that the use of forced choice formats may not adequately reflect actual recognition performance. The use of standardized tests to examine emotion recognition ability leads to valid results and can be used in different contexts. For example, the images presented here appear suitable for diagnosing deficits in emotion recognition in the context of psychological disorders and for mapping treatment progress. PMID:23798981
LSD Acutely Impairs Fear Recognition and Enhances Emotional Empathy and Sociality
Dolder, Patrick C; Schmid, Yasmin; Müller, Felix; Borgwardt, Stefan; Liechti, Matthias E
2016-01-01
Lysergic acid diethylamide (LSD) is used recreationally and has been evaluated as an adjunct to psychotherapy to treat anxiety in patients with life-threatening illness. LSD is well-known to induce perceptual alterations, but unknown is whether LSD alters emotional processing in ways that can support psychotherapy. We investigated the acute effects of LSD on emotional processing using the Face Emotion Recognition Task (FERT) and Multifaceted Empathy Test (MET). The effects of LSD on social behavior were tested using the Social Value Orientation (SVO) test. Two similar placebo-controlled, double-blind, random-order, crossover studies were conducted using 100 μg LSD in 24 subjects and 200 μg LSD in 16 subjects. All of the subjects were healthy and mostly hallucinogen-naive 25- to 65-year-old volunteers (20 men, 20 women). LSD produced feelings of happiness, trust, closeness to others, enhanced explicit and implicit emotional empathy on the MET, and impaired the recognition of sad and fearful faces on the FERT. LSD enhanced the participants' desire to be with other people and increased their prosocial behavior on the SVO test. These effects of LSD on emotion processing and sociality may be useful for LSD-assisted psychotherapy. PMID:27249781
Familiarity and face emotion recognition in patients with schizophrenia.
Lahera, Guillermo; Herrera, Sara; Fernández, Cristina; Bardón, Marta; de los Ángeles, Victoria; Fernández-Liria, Alberto
2014-01-01
To assess emotion recognition of familiar and unknown faces in a sample of patients with schizophrenia and healthy controls. Face emotion recognition in 18 outpatients diagnosed with schizophrenia (DSM-IV-TR) and 18 healthy volunteers was assessed with two emotion recognition tasks, one using familiar faces and one using unknown faces. Each subject was accompanied by 4 familiar people (parents, siblings or friends), who were photographed expressing the six Ekman basic emotions. Face emotion recognition for familiar faces was assessed with this ad hoc instrument. In each case, the patient rated (from 1 to 10) the subjective familiarity and affective valence corresponding to each person. Patients with schizophrenia not only showed a deficit in the recognition of emotions on unknown faces (p=.01), but also showed an even more pronounced deficit on familiar faces (p=.001). Controls had a similar success rate in the unknown-faces task (mean: 18 +/- 2.2) and the familiar-faces task (mean: 17.4 +/- 3). However, patients scored significantly lower in the familiar-faces task (mean: 13.2 +/- 3.8) than in the unknown-faces task (mean: 16 +/- 2.4; p<.05). In both tests, the highest number of errors was for the emotions of anger and fear. Subjectively, the patient group reported lower levels of familiarity and emotional valence for their respective relatives (p<.01). The sense of familiarity may be a factor involved in face emotion recognition, and it may be disturbed in schizophrenia. © 2013.
Memory and event-related potentials for rapidly presented emotional pictures.
Versace, Francesco; Bradley, Margaret M; Lang, Peter J
2010-08-01
Dense array event-related potentials (ERPs) and memory performance were assessed following rapid serial visual presentation (RSVP) of emotional and neutral pictures. Despite the extremely brief presentation, emotionally arousing pictures prompted an enhanced negative voltage over occipital sensors, compared to neutral pictures, replicating previous encoding effects. Emotionally arousing pictures were also remembered better in a subsequent recognition test, with higher hit rates and better discrimination performance. ERPs measured during the recognition test showed both an early (250-350 ms) frontally distributed difference between hits and correct rejections, and a later (400-500 ms), more centrally distributed difference, consistent with effects of recognition on ERPs typically found using slower presentation rates. The data are consistent with the hypothesis that features of affective pictures pop out during rapid serial visual presentation, prompting better memory performance.
Gaudelus, B; Virgile, J; Peyroux, E; Leleu, A; Baudouin, J-Y; Franck, N
2015-06-01
The impairment of social cognition, including facial affect recognition, is a well-established trait in schizophrenia, and specific cognitive remediation programs focusing on facial affect recognition have been developed by different teams worldwide. However, even though social cognitive impairments have been confirmed, previous studies have also shown heterogeneity of results across subjects. Therefore, individual abilities should be assessed before proposing such programs. Most research teams apply tasks based on the facial affect stimuli of Ekman et al. or Gur et al. However, these tasks are not easily applicable in routine clinical practice. Here, we present the Facial Emotions Recognition Test (TREF), which is designed to identify facial affect recognition impairments in clinical practice. The test is composed of 54 photos and evaluates abilities in the recognition of six universal emotions (joy, anger, sadness, fear, disgust and contempt). Each of these emotions is represented with colored photos of 4 different models (two men and two women) at nine intensity levels from 20 to 100%. Each photo is presented for 10 seconds; no time limit for responding is applied. The present study compared scores on the TREF in a sample of healthy controls (64 subjects) and people with stabilized schizophrenia (45 subjects) according to DSM-IV-TR criteria. We analysed global scores for all emotions, as well as subscores for each emotion, between these two groups, taking into account gender differences. Our results were coherent with previous findings. Applying the TREF, we confirmed an impairment in facial affect recognition in schizophrenia by showing significant differences between the two groups in global results (76.45% for healthy controls versus 61.28% for people with schizophrenia), as well as in subscores for each emotion except joy.
Scores for women were significantly higher than for men in the population without psychiatric diagnosis. The study also allowed the identification of cut-off scores: results below 2 standard deviations of the healthy-control average (61.57%) pointed to a facial affect recognition deficit. The TREF appears to be a useful tool for identifying facial affect recognition impairment in schizophrenia. Neuropsychologists who have tried the task report positive feedback. The TREF is easy to use (duration of about 15 minutes), easy to apply in subjects with attentional difficulties, and tests facial affect recognition at ecologically realistic intensity levels. These results have to be confirmed in the future with larger sample sizes and in comparison with other tasks evaluating facial affect recognition processes. Copyright © 2014 L'Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.
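The deficit cut-off described above follows a simple rule: flag any score more than 2 standard deviations below the healthy-control mean. A minimal sketch of that arithmetic, taking the reported control mean of 76.45% and a control SD of about 7.44% (the SD is not stated in the abstract; it is inferred here by back-computing from the reported 61.57% cut-off):

```python
def deficit_cutoff(control_mean, control_sd, k=2.0):
    """Cut-off below which a score is flagged as a deficit:
    k standard deviations below the healthy-control mean."""
    return control_mean - k * control_sd

# Reported TREF control mean; SD inferred from the reported cut-off.
cutoff = deficit_cutoff(76.45, 7.44)
print(round(cutoff, 2))  # 61.57, matching the reported threshold
```

On this rule, the patient-group mean of 61.28% falls just below the threshold, consistent with the deficit the study reports at the group level.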
Schultebraucks, Katharina; Deuter, Christian E; Duesenberg, Moritz; Schulze, Lars; Hellmann-Regen, Julian; Domke, Antonia; Lockenvitz, Lisa; Kuehl, Linn K; Otte, Christian; Wingenfeld, Katja
2016-09-01
Selective attention toward emotional cues and emotion recognition of facial expressions are important aspects of social cognition. Stress modulates social cognition through cortisol, which acts on glucocorticoid (GR) and mineralocorticoid receptors (MR) in the brain. We examined the role of MR activation on attentional bias toward emotional cues and on emotion recognition. We included 40 healthy young women and 40 healthy young men (mean age 23.9 ± 3.3), who received either 0.4 mg of the MR agonist fludrocortisone or placebo. A dot-probe paradigm was used to test for attentional biases toward emotional cues (happy and sad faces). Moreover, we used a facial emotion recognition task to investigate the ability to recognize emotional valence (anger and sadness) from facial expression in four graded categories of emotional intensity (20, 30, 40, and 80%). In the emotional dot-probe task, we found a main effect of treatment and a treatment × valence interaction. Post hoc analyses revealed an attentional bias away from sad faces after placebo intake and, after fludrocortisone, a shift in selective attention toward sad faces relative to placebo. We found no attentional bias toward happy faces after fludrocortisone or placebo intake. In the facial emotion recognition task, there was no main effect of treatment. MR stimulation seems to be important in modulating quick, automatic emotional processing, i.e., a shift in selective attention toward negative emotional cues. Our results confirm and extend previous findings on MR function. However, we did not find an effect of MR stimulation on emotion recognition.
Emotion Knowledge and Attentional Differences in Preschoolers Showing Context-Inappropriate Anger.
Locke, Robin L; Lang, Nichole J
2016-08-01
Some children show anger inappropriate for the situation based on the predominant incentives, which is called context-inappropriate anger. Children need to attend to and interpret situational incentives for appropriate emotional responses. We examined associations of context-inappropriate anger with emotion recognition and attention problems in 43 preschoolers (42% male; M age = 55.1 months, SD = 4.1). Parents rated context-inappropriate anger across situations. Teachers rated attention problems using the Child Behavior Checklist-Teacher Report Form. Emotion recognition was the ability to recognize emotional faces using the Emotion Matching Test. Anger perception bias was indicated by anger attributed to non-anger situations using an adapted Affect Knowledge Test. Twenty-eight percent of children showed context-inappropriate anger, which correlated with lower emotion recognition (β = -.28) and higher attention problems (β = .36). Higher attention problems correlated with more anger perception bias (β = .32). This cross-sectional, correlational study provides preliminary findings that children with context-inappropriate anger showed more attention problems, which suggests that both "problems" tend to covary and associate with deficits or biases in emotion knowledge. © The Author(s) 2016.
Biases in facial and vocal emotion recognition in chronic schizophrenia
Dondaine, Thibaut; Robert, Gabriel; Péron, Julie; Grandjean, Didier; Vérin, Marc; Drapier, Dominique; Millet, Bruno
2014-01-01
There has been extensive research on impaired emotion recognition in schizophrenia in the facial and vocal modalities. The literature points to biases toward non-relevant emotions for emotional faces, but few studies have examined biases in emotion recognition across different modalities (facial and vocal). In order to test emotion recognition biases, we exposed 23 patients with stabilized chronic schizophrenia and 23 healthy controls (HCs) to emotional facial and vocal tasks, asking them to rate emotional intensity on visual analog scales. We showed that patients with schizophrenia provided higher intensity ratings on the non-target scales (e.g., the surprise scale for fear stimuli) than HCs for both tasks. Furthermore, with the exception of neutral vocal stimuli, they provided the same intensity ratings on the target scales as the HCs. These findings suggest that patients with chronic schizophrenia have emotional biases when judging emotional stimuli in the visual and vocal modalities. These biases may stem from a basic sensorial deficit, a high-order cognitive dysfunction, or both. The respective roles of prefrontal-subcortical circuitry and the basal ganglia are discussed. PMID:25202287
Assessment of Emotional Experience and Emotional Recognition in Complicated Grief
Fernández-Alcántara, Manuel; Cruz-Quintana, Francisco; Pérez-Marfil, M. N.; Catena-Martínez, Andrés; Pérez-García, Miguel; Turnbull, Oliver H.
2016-01-01
There is substantial evidence of bias in the processing of emotion in people with complicated grief (CG). Previous studies have tended to assess the expression of emotion in CG, but other aspects of emotion (mainly emotion recognition, and the subjective aspects of emotion) have not been addressed, despite their importance for practicing clinicians. A quasi-experimental design with two matched groups (Complicated Grief, N = 24 and Non-Complicated Grief, N = 20) was carried out. The Facial Expression of Emotion Test (emotion recognition), a set of pictures from the International Affective Picture System (subjective experience of emotion) and the Symptom Checklist 90 Revised (psychopathology) were employed. The CG group showed lower scores on the dimension of valence for specific conditions on the IAPS, related to the subjective experience of emotion. In addition, they presented higher values of psychopathology. In contrast, statistically significant results were not found for the recognition of emotion. In conclusion, from a neuropsychological point of view, the subjective aspects of emotion and psychopathology seem central in explaining the experience of those with CG. These results are clinically significant for psychotherapists and psychoanalysts working in the field of grief and loss. PMID:26903928
Yang, Chengqing; Zhang, Tianhong; Li, Zezhi; Heeramun-Aubeeluck, Anisha; Liu, Na; Huang, Nan; Zhang, Jie; He, Leiying; Li, Hui; Tang, Yingying; Chen, Fazhan; Liu, Fei; Wang, Jijun; Lu, Zheng
2015-10-08
Although many studies have examined executive functions and facial emotion recognition in people with schizophrenia, few have focused on the correlation between them. Furthermore, their relationship in the siblings of patients remains unclear. The aim of the present study is to examine the correlation between executive functions and facial emotion recognition in patients with first-episode schizophrenia and their siblings. Thirty patients with first-episode schizophrenia, twenty-six of their siblings, and thirty healthy controls were enrolled. They completed facial emotion recognition tasks using the Ekman Standard Faces Database, and executive functioning was measured with the Wisconsin Card Sorting Test (WCST). Hierarchical regression analysis was applied to assess the correlation between executive functions and facial emotion recognition. Our study found that in siblings, the accuracy in recognizing low-degree 'disgust' emotion was negatively correlated with the total correct rate in the WCST (r = -0.614, p = 0.023), but was positively correlated with the total error in the WCST (r = 0.623, p = 0.020); the accuracy in recognizing 'neutral' emotion was positively correlated with the total error rate in the WCST (r = 0.683, p = 0.014) and negatively correlated with the total correct rate in the WCST (r = -0.677, p = 0.017). People with schizophrenia showed an impairment in facial emotion recognition when identifying moderate 'happy' facial emotion, the accuracy of which was significantly correlated with the number of completed categories in the WCST (R² = 0.432, p < .05). There were no correlations between executive functions and facial emotion recognition in the healthy control group. Our study demonstrated that facial emotion recognition impairment correlated with executive function impairment in people with schizophrenia and their unaffected siblings but not in healthy controls.
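The coefficients reported in this abstract (e.g., r = -0.614 between 'disgust' recognition accuracy and WCST total correct in siblings) are plain Pearson correlations. A minimal sketch with fabricated data; only the computation itself mirrors the analysis described:

```python
# Minimal sketch of the bivariate analysis behind correlations such as
# r = -0.614 (accuracy vs. WCST total correct). The data below are
# fabricated for illustration, not taken from the study.

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

accuracy = [0.9, 0.7, 0.8, 0.6, 0.5]  # hypothetical recognition accuracy
wcst_correct = [20, 35, 28, 40, 45]   # hypothetical WCST total correct
print(round(pearson_r(accuracy, wcst_correct), 3))  # strongly negative
```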
Demirci, Esra; Erdogan, Ayten
2016-12-01
The objectives of this study were to evaluate both face and emotion recognition, to detect differences among attention deficit and hyperactivity disorder (ADHD) subgroups, to identify effects of gender, and to assess the effects of methylphenidate and atomoxetine treatment on both face and emotion recognition in patients with ADHD. The study sample consisted of 41 male and 29 female patients, 8-15 years of age, who were diagnosed as having combined type ADHD (N = 26), hyperactive/impulsive type ADHD (N = 21), or inattentive type ADHD (N = 23) but had not previously used any medication for ADHD, and 35 male and 25 female healthy individuals. Long-acting methylphenidate (OROS-MPH) was prescribed to 38 patients, whereas atomoxetine was prescribed to 32 patients. The Reading the Mind in the Eyes Test (RMET) and the Benton Face Recognition Test (BFRT) were administered to all participants before and after treatment. The patients with ADHD had a significantly lower number of correct answers on the child and adolescent RMET and on the BFRT than the healthy controls. Among the ADHD subtypes, the hyperactive/impulsive subtype had a lower number of correct answers on the RMET than the inattentive subtype, and a lower number of correct answers on the short and long forms of the BFRT than the combined and inattentive subtypes. Male and female patients with ADHD did not differ significantly with respect to the number of correct answers on the RMET and BFRT. The patients showed significant improvement on the RMET and BFRT after treatment with OROS-MPH or atomoxetine. Patients with ADHD have difficulties in face recognition as well as emotion recognition. Both OROS-MPH and atomoxetine affect emotion recognition. However, further studies on face and emotion recognition in ADHD are needed.
Advanced Parkinson disease patients have impairment in prosody processing.
Albuquerque, Luisa; Martins, Maurício; Coelho, Miguel; Guedes, Leonor; Ferreira, Joaquim J; Rosa, Mário; Martins, Isabel Pavão
2016-01-01
The ability to recognize and interpret emotions in others is a crucial prerequisite of adequate social behavior. Impairments in emotion processing have been reported from the early stages of Parkinson's disease (PD). This study aims to characterize emotion recognition in advanced Parkinson's disease (APD) candidates for deep-brain stimulation and to compare emotion recognition abilities in the visual and auditory domains. APD patients, defined as those with levodopa-induced motor complications (N = 42), and healthy controls (N = 43) matched by gender, age, and educational level, undertook the Comprehensive Affect Testing System (CATS), a battery that evaluates recognition of seven facial expression categories (happiness, sadness, anger, fear, surprise, disgust, and neutral) and four emotions in prosody (happiness, sadness, anger, and fear). APD patients were assessed during the "ON" state. Group performance was compared with independent-samples t tests. Compared to controls, APD patients had significantly lower scores on the discrimination and naming of emotions in prosody and on the visual discrimination of neutral faces, but showed no significant differences on visual emotional tasks. The contrasting performance in emotional processing between visual and auditory stimuli suggests that APD candidates for surgery have either a selective difficulty in recognizing emotions in prosody or a general defect in prosody processing. Studies investigating early-stage PD, and the effect of subcortical lesions on prosody processing, favor the latter interpretation. Further research is needed to understand these deficits in emotional prosody recognition and their possible contribution to later behavioral or neuropsychiatric manifestations of PD.
A Diffusion Model Analysis of Decision Biases Affecting Delayed Recognition of Emotional Stimuli
Bowen, Holly J.; Spaniol, Julia; Patel, Ronak; Voss, Andreas
2016-01-01
Previous empirical work suggests that emotion can influence accuracy and the cognitive biases underlying recognition memory, depending on the experimental conditions. The current study examines the effects of arousal and valence on delayed recognition memory using the diffusion model, which allows the separation of two decision biases thought to underlie memory: response bias and memory bias. Memory bias has not been given much attention in the literature but can provide insight into the retrieval dynamics of emotion-modulated memory. Participants viewed emotional pictorial stimuli; half were given a recognition test 1 day later and the other half 7 days later. Analyses revealed that emotional valence generally evokes liberal responding, whereas high arousal evokes liberal responding only at a short retention interval. The memory bias analyses indicated that participants experienced greater familiarity with high-arousal compared to low-arousal items, and this pattern became more pronounced as study-test lag increased; positive items evoked greater familiarity than negative items, and this pattern remained stable across retention intervals. The findings provide insight into the separate contributions of valence and arousal to the cognitive mechanisms underlying delayed emotion-modulated memory. PMID:26784108
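The diffusion-model distinction drawn in this abstract (response bias versus memory bias) can be illustrated with a toy simulation. This is not the authors' fitted analysis; it is a sketch under standard drift-diffusion assumptions, in which the starting point z stands in for response bias and the drift rate v for memory (familiarity) bias. Both raise the rate of "old" responses, but for different reasons.

```python
# Toy random-walk simulation (not the authors' model fit): trials drift
# between a lower bound 0 ("new") and an upper bound a ("old").
# z = starting point as a fraction of a (response bias);
# v = drift rate (memory/familiarity bias).
import random

def simulate_old_rate(v, z, a=1.0, dt=0.01, s=1.0, n=5000, seed=1):
    """Proportion of trials absorbed at the upper ('old') boundary."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x = z * a  # starting point
        while 0.0 < x < a:
            x += v * dt + s * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        hits += x >= a
    return hits / n

print(simulate_old_rate(v=0.0, z=0.5))  # ~0.5: no bias of either kind
print(simulate_old_rate(v=0.0, z=0.7))  # >0.5 via response bias alone
print(simulate_old_rate(v=0.5, z=0.5))  # >0.5 via memory (drift) bias alone
```

Because the two parameters shift "old" rates through different mechanisms, fitting them separately is what lets the model disentangle liberal responding from genuine familiarity.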
Emotion recognition ability in mothers at high and low risk for child physical abuse.
Balge, K A; Milner, J S
2000-10-01
The study sought to determine if high-risk, compared to low-risk, mothers make more emotion recognition errors when they attempt to recognize emotions in children and adults. Thirty-two demographically matched high-risk (n = 16) and low-risk (n = 16) mothers were asked to identify different emotions expressed by children and adults. Sets of high- and low-intensity, visual and auditory emotions were presented. Mothers also completed measures of stress, depression, and ego-strength. High-risk, compared to low-risk, mothers showed a tendency to make more errors on the visual and auditory emotion recognition tasks, with a trend toward more errors on the low-intensity, visual stimuli. However, the observed trends were not significant. Only a post-hoc test of error rates across all stimuli indicated that high-risk, compared to low-risk, mothers made significantly more emotion recognition errors. Although situational stress differences were not found, high-risk mothers reported significantly higher levels of general parenting stress and depression and lower levels of ego-strength. Since only trends and a significant post hoc finding of more overall emotion recognition errors in high-risk mothers were observed, additional research is needed to determine if high-risk mothers have emotion recognition deficits that may impact parent-child interactions. As in prior research, the study found that high-risk mothers reported more parenting stress and depression and less ego-strength.
Bihippocampal damage with emotional dysfunction: impaired auditory recognition of fear.
Ghika-Schmid, F; Ghika, J; Vuilleumier, P; Assal, G; Vuadens, P; Scherer, K; Maeder, P; Uske, A; Bogousslavsky, J
1997-01-01
A right-handed man developed a sudden transient amnestic syndrome associated with bilateral hemorrhage of the hippocampi, probably due to Urbach-Wiethe disease. In the 3rd month, despite significant hippocampal structural damage on imaging, only a milder degree of retrograde and anterograde amnesia persisted on detailed neuropsychological examination. On systematic testing of the recognition of facial and vocal expression of emotion, we found an impairment of the vocal perception of fear, but not of other emotions such as joy, sadness, and anger. This selective impairment of fear perception was not present in the recognition of facial expression of emotion. Thus emotional perception varies according to the different aspects of emotions and the modality of presentation (faces versus voices). This is consistent with the idea that there may be multiple emotion systems. The study of emotional perception in this unique case of bilateral involvement of the hippocampus suggests that this structure may play a critical role in the recognition of fear in vocal expression, possibly dissociated from that of other emotions and from that of fear in facial expression. In light of recent data suggesting that the amygdala plays a role in the recognition of fear in the auditory as well as the visual modality, this could indicate that the hippocampus is part of the auditory pathway of fear recognition.
Dynorphins regulate the strength of social memory.
Bilkei-Gorzo, A; Mauer, D; Michel, K; Zimmer, A
2014-02-01
Emotionally arousing events, such as an encounter with an unfamiliar conspecific, produce strong and vivid memories, in which the hippocampus and amygdala play a crucial role. It is less well understood, however, which neurotransmitter systems regulate the strength of social memories, which have a strong emotional component. It was shown previously that dynorphin signalling is involved in the formation and extinction of fear memories, so we asked whether it influences social memories as well. Mice with a genetic deletion of the prodynorphin gene Pdyn (Pdyn(-/-)) showed a superior partner recognition ability, whereas their performance in the object recognition test was identical to that of wild-type mice. Pharmacological blockade of kappa opioid receptors (KORs) led to an enhanced social memory in wild-type animals, whereas activation of KORs reduced the recognition ability of Pdyn(-/-) mice. The partner recognition test situation induced a greater elevation in dynorphin A levels in the central and basolateral amygdala as well as in the hippocampus, and also higher dynorphin B levels in the hippocampus, than the object recognition test situation. Our results suggest that dynorphin system activity is increased in emotionally arousing situations and decreases the formation of social memories. Thus, dynorphin signalling is involved in the formation of social memories by diminishing the emotional component of the experience. Copyright © 2013 Elsevier Ltd. All rights reserved.
von Piekartz, H; Lüers, J; Daumeyer, H; Mohr, G
2017-10-01
The aim of this study is to investigate the effects of kinesiophobia on emotion recognition and left/right judgement. A total of 67 patients with chronic musculoskeletal pain were tested. In all, 24 patients achieved a score >37 on the Tampa Scale of Kinesiophobia and were included in the study. The ability to recognize basic emotions coded through facial expression was assessed using the Facially Expressed Emotion Labeling (FEEL) test. Left/right judgement was evaluated using a special Face-mirroring Assessment and Treatment program. The Toronto Alexithymia Scale-26 (TAS-26) was used to assess if the patients showed signs of alexithymia. The FEEL score of patients with kinesiophobia was significantly lower (p = 0.019). The recognition of the basic emotions fear (p = 0.026), anger (p = 0.027), and surprise (p = 0.014) showed significant differences in comparison to unaffected subjects. The basic emotion surprise was recognized more often by patients with kinesiophobia (p = 0.014). Only Scale 1 of the TAS-26 (identification problems of emotions) showed a significant difference between patients with kinesiophobia (p = 0.008) and healthy subjects. The results show that kinesiophobic patients have altered recognition of emotions, problems in left/right judgement, and show signs of alexithymia.
Lodder, Gerine M A; Scholte, Ron H J; Goossens, Luc; Engels, Rutger C M E; Verhagen, Maaike
2016-02-01
Based on the belongingness regulation theory (Gardner et al., 2005, Pers. Soc. Psychol. Bull., 31, 1549), this study focuses on the relationship between loneliness and social monitoring. Specifically, we examined whether loneliness relates to performance on three emotion recognition tasks and whether lonely individuals show increased gazing towards their conversation partner's face in a real-life conversation. Study 1 examined 170 college students (M age = 19.26, SD = 1.21) who completed an emotion recognition task with dynamic stimuli (morph task) and a micro(-emotion) expression recognition task. Study 2 examined 130 college students (M age = 19.33, SD = 2.00) who completed the Reading the Mind in the Eyes Test and who had a conversation with an unfamiliar peer while their gaze direction was videotaped. In both studies, loneliness was measured using the UCLA Loneliness Scale version 3 (Russell, 1996, J. Pers. Assess., 66, 20). The results showed that loneliness was unrelated to emotion recognition on all tasks, but that it was related to increased gaze towards the conversation partner's face. Implications for the belongingness regulation system of lonely individuals are discussed. © 2015 The British Psychological Society.
The automaticity of emotion recognition.
Tracy, Jessica L; Robins, Richard W
2008-02-01
Evolutionary accounts of emotion typically assume that humans evolved to quickly and efficiently recognize emotion expressions because these expressions convey fitness-enhancing messages. The present research tested this assumption in 2 studies. Specifically, the authors examined (a) how quickly perceivers could recognize expressions of anger, contempt, disgust, embarrassment, fear, happiness, pride, sadness, shame, and surprise; (b) whether accuracy is improved when perceivers deliberate about each expression's meaning (vs. respond as quickly as possible); and (c) whether accurate recognition can occur under cognitive load. Across both studies, perceivers quickly and efficiently (i.e., under cognitive load) recognized most emotion expressions, including the self-conscious emotions of pride, embarrassment, and shame. Deliberation improved accuracy in some cases, but these improvements were relatively small. Discussion focuses on the implications of these findings for the cognitive processes underlying emotion recognition.
Golan, Ofer; Ashwin, Emma; Granader, Yael; McClintock, Suzy; Day, Kate; Leggett, Victoria; Baron-Cohen, Simon
2010-03-01
This study evaluated The Transporters, an animated series designed to enhance emotion comprehension in children with autism spectrum conditions (ASC). A group of 20 children with ASC (aged 4-7) watched The Transporters every day for 4 weeks. Participants were tested before and after the intervention on emotional vocabulary and emotion recognition at three levels of generalization. Two matched control groups of children (an ASC group, n = 18, and a typically developing group, n = 18) were also assessed twice without any intervention. The intervention group improved significantly more than the clinical control group on all task levels, performing comparably to typical controls at Time 2. We conclude that using The Transporters significantly improves emotion recognition in children with ASC. Future research should evaluate the series' effectiveness with lower-functioning individuals.
Park, Soowon; Kim, Taehoon; Shin, Seong A; Kim, Yu Kyeong; Sohn, Bo Kyung; Park, Hyeon-Ju; Youn, Jung-Hae; Lee, Jun-Young
2017-01-01
Background: Facial emotion recognition (FER) is impaired in individuals with frontotemporal dementia (FTD) and Alzheimer's disease (AD) when compared to healthy older adults. Since deficits in emotion recognition are closely related to caregiver burden and social interactions, researchers have a fundamental interest in FER performance in patients with dementia. Purpose: The purpose of this study was to identify the performance profiles for six facial emotions (i.e., fear, anger, disgust, sadness, surprise, and happiness) and neutral faces among Korean healthy controls (HC) and individuals with mild cognitive impairment (MCI), AD, and FTD. Additionally, the neuroanatomical correlates of facial emotions were investigated. Methods: A total of 110 older adult participants (33 HC, 32 MCI, 32 AD, 13 FTD) were recruited from two medical centers in metropolitan areas of South Korea. These individuals underwent an FER test that assessed the recognition of emotions or the absence of emotion (neutral) in 35 facial stimuli. Repeated measures two-way analyses of variance were used to examine the distinct profiles of emotion recognition among the four groups. We also performed brain imaging and voxel-based morphometry (VBM) to examine the associations between FER scores and gray matter volume. Results: The mean score for negative emotion recognition (i.e., fear, anger, disgust, and sadness) clearly discriminated FTD participants from HC and individuals with MCI and AD [F(3,106) = 10.829, p < 0.001, η² = 0.235], whereas the mean score for positive emotion recognition (i.e., surprise and happiness) did not. The VBM analysis showed that negative emotions were correlated with gray matter volume in anterior temporal regions, whereas positive emotions were related to gray matter volume in fronto-parietal regions. Conclusion: Impairment of negative FER in patients with FTD is cross-cultural. The discrete neural correlates of FER indicate that emotion recognition is a multi-modal system in the brain. Focusing on negative emotion recognition is a more effective way to discriminate FTD from healthy aging, MCI, and AD in older Korean adults. PMID:29249960
Milders, Maarten; Ietswaart, Magdalena; Crawford, John R; Currie, David
2008-03-01
Although the adverse consequences of changes in social behavior following traumatic brain injury (TBI) are well documented, relatively little is known about possible underlying neuropsychological deficits. Following a model originally developed for social behavior deficits in schizophrenia, we investigated whether impairments in emotion recognition, understanding of other people's intentions ("theory of mind"), and cognitive flexibility soon after first TBI or 1 year later were associated with self and proxy ratings of behavior following TBI. Each of the three functions was assessed with two separate tests, and ratings of behavior were collected on three questionnaires. Patients with TBI (n = 33) were impaired in emotion recognition, "theory of mind," and cognitive flexibility compared with matched orthopedic controls (n = 34). Proxy ratings showed increases in behavioral problems 1 year following injury in the TBI group but not in the control group. However, test performance was not associated with questionnaire data. The severity of the impairments in emotion recognition, understanding intention, and flexibility was unrelated to the severity of behavioral problems following TBI. These findings failed to confirm the model used for social behavior deficits and may cast doubt on the alleged link between deficits in emotion recognition or theory of mind and social functioning.
Oxytocin Reduces Face Processing Time but Leaves Recognition Accuracy and Eye-Gaze Unaffected.
Hubble, Kelly; Daughters, Katie; Manstead, Antony S R; Rees, Aled; Thapar, Anita; van Goozen, Stephanie H M
2017-01-01
Previous studies have found that oxytocin (OXT) can improve the recognition of emotional facial expressions; it has been proposed that this effect is mediated by an increase in attention to the eye-region of faces. Nevertheless, evidence in support of this claim is inconsistent, and few studies have directly tested the effect of oxytocin on emotion recognition via altered eye-gaze. In a double-blind, within-subjects, randomized controlled experiment, 40 healthy male participants received 24 IU intranasal OXT and placebo in two identical experimental sessions separated by a 2-week interval. Visual attention to the eye-region was assessed on both occasions while participants completed a static facial emotion recognition task using medium intensity facial expressions. Although OXT had no effect on emotion recognition accuracy, recognition performance was improved because face processing was faster across emotions under the influence of OXT. This effect was marginally significant (p < .06). Consistent with a previous study using dynamic stimuli, OXT had no effect on eye-gaze patterns when viewing static emotional faces, and this was not related to recognition accuracy or face processing time. These findings suggest that OXT-induced enhancement of facial emotion recognition is not necessarily mediated by an increase in attention to the eye-region of faces, as previously assumed. We discuss several methodological issues which may explain discrepant findings and suggest that the effect of OXT on visual attention may differ depending on task requirements. (JINS, 2017, 23, 23-33).
The Effects of Cognitive Reappraisal and Expressive Suppression on Memory of Emotional Pictures.
Wang, Yan Mei; Chen, Jie; Han, Ben Yue
2017-01-01
In the field of emotion research, the influence of emotion regulation strategies on memory with emotional materials has been widely discussed in recent years. However, existing studies have focused exclusively on regulating negative emotion but not positive emotion. Therefore, in the present study, we investigated the influence of emotion regulation strategies for positive emotion on memory. One hundred and twenty college students were selected as participants. Emotional pictures (positive, negative and neutral) were selected from Chinese Affective Picture System (CAPS) as experimental materials. We employed a mixed, 4 (emotion regulation strategies: cognitive up-regulation, cognitive down-regulation, expressive suppression, passive viewing) × 3 (emotional pictures: positive, neutral, negative) experimental design. We investigated the influences of different emotion regulation strategies on memory performance, using free recall and recognition tasks with pictures varying in emotional content. The results showed that recognition and free recall memory performance of the cognitive reappraisal groups (up-regulation and down-regulation) were both better than that of the passive viewing group for all emotional pictures. No significant differences were reported in the two kinds of memory scores between the expressive suppression and passive viewing groups. The results also showed that the memory performance with the emotional pictures differed according to the form of memory test. For the recognition test, participants performed better with positive images than with neutral images. Free recall scores with negative images were higher than those with neutral images. These results suggest that both cognitive reappraisal regulation strategies (up-regulation and down-regulation) promoted explicit memories of the emotional content of stimuli, and the form of memory test influenced performance with emotional pictures.
Empathy costs: Negative emotional bias in high empathisers.
Chikovani, George; Babuadze, Lasha; Iashvili, Nino; Gvalia, Tamar; Surguladze, Simon
2015-09-30
Excessive empathy has been associated with compassion fatigue in health professionals and caregivers. We investigated the effect of empathy on emotion processing in 137 healthy individuals of both sexes. We tested the hypothesis that high empathy may underlie increased sensitivity to negative emotions, which may interact with gender. Facial emotion stimuli comprised happy, angry, fearful, and sad faces presented at different intensities (mild and prototypical) and different durations (500 ms and 2000 ms). The parameters of emotion processing were represented by discrimination accuracy, response bias, and reaction time. We found that higher empathy was associated with better recognition of all emotions. We also demonstrated that higher empathy was associated with a response bias towards sad and fearful faces. The reaction time analysis revealed that higher empathy in females was associated with faster (compared with males) recognition of mildly sad faces of brief duration. We conclude that although empathic abilities provide advantages in the recognition of all facial emotional expressions, the bias towards emotional negativity may potentially carry a risk of empathic distress. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Recognizing biological motion and emotions from point-light displays in autism spectrum disorders.
Nackaerts, Evelien; Wagemans, Johan; Helsen, Werner; Swinnen, Stephan P; Wenderoth, Nicole; Alaerts, Kaat
2012-01-01
One of the main characteristics of Autism Spectrum Disorder (ASD) is difficulty with social interaction and communication. Here, we explored ASD-related alterations in 'reading' the body language of other humans. Accuracy and reaction times were assessed on two observational tasks involving the recognition of 'biological motion' and 'emotions' from point-light displays (PLDs). Eye movements were recorded during the completion of the tests. Results indicated that typically developing participants were more accurate than participants with ASD in recognizing biological motion or emotions from PLDs. No accuracy differences were revealed on two control tasks (involving the indication of color changes in the moving point-lights). Group differences in reaction times existed on all tasks, but effect sizes were higher for the biological motion and emotion recognition tasks. Biological motion recognition abilities were related to a person's ability to recognize emotions from PLDs. However, ASD-related atypicalities in emotion recognition could not entirely be attributed to more basic deficits in biological motion recognition, suggesting an additional ASD-specific deficit in recognizing the emotional dimension of the point-light displays. The eye-movement results indicated that participants with ASD generally produced more saccades and shorter fixation durations than the control group. However, especially for emotion recognition, these altered eye movements were associated with reductions in task performance.
Age, gender, and puberty influence the development of facial emotion recognition.
Lawrence, Kate; Campbell, Ruth; Skuse, David
2015-01-01
Our ability to differentiate between simple facial expressions of emotion develops between infancy and early adulthood, yet few studies have explored the developmental trajectory of emotion recognition using a single methodology across a wide age-range. We investigated the development of emotion recognition abilities through childhood and adolescence, testing the hypothesis that children's ability to recognize simple emotions is modulated by chronological age, pubertal stage and gender. In order to establish norms, we assessed 478 children aged 6-16 years, using the Ekman-Friesen Pictures of Facial Affect. We then modeled these cross-sectional data in terms of competence in accurate recognition of the six emotions studied, when the positive correlation between emotion recognition and IQ was controlled. Significant linear trends were seen in children's ability to recognize facial expressions of happiness, surprise, fear, and disgust; there was improvement with increasing age. In contrast, for sad and angry expressions there is little or no change in accuracy over the age range 6-16 years; near-adult levels of competence are established by middle-childhood. In a sampled subset, pubertal status influenced the ability to recognize facial expressions of disgust and anger; there was an increase in competence from mid to late puberty, which occurred independently of age. A small female advantage was found in the recognition of some facial expressions. The normative data provided in this study will aid clinicians and researchers in assessing the emotion recognition abilities of children and will facilitate the identification of abnormalities in a skill that is often impaired in neurodevelopmental disorders. If emotion recognition abilities are a good model with which to understand adolescent development, then these results could have implications for the education, mental health provision and legal treatment of teenagers. PMID:26136697
Recognition of Emotions in Autism: A Formal Meta-Analysis
ERIC Educational Resources Information Center
Uljarevic, Mirko; Hamilton, Antonia
2013-01-01
Determining the integrity of emotion recognition in autistic spectrum disorder is important to our theoretical understanding of autism and to teaching social skills. Previous studies have reported both positive and negative results. Here, we take a formal meta-analytic approach, bringing together data from 48 papers testing over 980 participants…
Development of Emotional Facial Recognition in Late Childhood and Adolescence
ERIC Educational Resources Information Center
Thomas, Laura A.; De Bellis, Michael D.; Graham, Reiko; Labar, Kevin S.
2007-01-01
The ability to interpret emotions in facial expressions is crucial for social functioning across the lifespan. Facial expression recognition develops rapidly during infancy and improves with age during the preschool years. However, the developmental trajectory from late childhood to adulthood is less clear. We tested older children, adolescents…
Yu, Shao Hua; Zhu, Jun Peng; Xu, You; Zheng, Lei Lei; Chai, Hao; He, Wei; Liu, Wei Bo; Li, Hui Chun; Wang, Wei
2012-12-01
To study the contribution of executive function to the abnormal recognition of facial expressions of emotion in schizophrenia, 88 paranoid schizophrenia patients and 75 healthy volunteers were assessed with the Japanese and Caucasian Facial Expressions of Emotion (JACFEE), the Wisconsin Card Sorting Test (WCST), the Positive and Negative Symptom Scale, and the Hamilton Anxiety and Depression Scales. Patients scored higher on the Positive and Negative Symptom Scale and the Hamilton Anxiety and Depression Scales, and displayed lower JACFEE recognition accuracies and poorer WCST performance. In patients, the JACFEE recognition accuracy for contempt and disgust was negatively correlated with the negative symptom scale score, while the recognition accuracy for fear was positively correlated with the positive symptom scale score and the recognition accuracy for surprise was negatively correlated with the general psychopathology score. Moreover, WCST performance predicted the JACFEE recognition accuracy for contempt, disgust, and sadness in patients, and perseverative errors negatively predicted the recognition accuracy for sadness in healthy volunteers. The JACFEE recognition accuracy for sadness predicted the WCST categories in paranoid schizophrenia patients. Recognition accuracy for social/moral emotions such as contempt, disgust, and sadness is related to executive function in paranoid schizophrenia patients, especially for sadness.
Leppänen, J M; Niehaus, D J H; Koen, L; Du Toit, E; Schoeman, R; Emsley, R
2006-06-01
Schizophrenia is associated with a deficit in the recognition of negative emotions from facial expressions. The present study examined the universality of this finding by studying facial expression recognition in an African Xhosa population. Forty-four Xhosa patients with schizophrenia and forty healthy controls were tested with a computerized task requiring rapid perceptual discrimination of matched positive (i.e. happy), negative (i.e. angry), and neutral faces. Patients were as accurate as controls in recognizing happy faces but showed a marked impairment in the recognition of angry faces. The impairment was particularly pronounced for high-intensity (open-mouth) angry faces. Patients also exhibited more false happy and angry responses to neutral faces than controls. No correlation between level of education or illness duration and emotion recognition was found, but the deficit in the recognition of negative emotions was more pronounced in familial than in non-familial cases of schizophrenia. These findings suggest that the deficit in the recognition of negative facial expressions may constitute a universal neurocognitive marker of schizophrenia.
Emotional memory and perception in temporal lobectomy patients with amygdala damage.
Brierley, B; Medford, N; Shaw, P; David, A S
2004-04-01
The human amygdala is implicated in the formation of emotional memories and the perception of emotional stimuli, particularly fear, across various modalities. To discern the extent to which these functions are related, 28 patients who had undergone anterior temporal lobectomy (13 left and 15 right) for intractable epilepsy were recruited. Structural magnetic resonance imaging showed that three of them had atrophy of the remaining amygdala. All participants were given tests of affect perception from facial and vocal expressions and of emotional memory, using a standard narrative test and a novel test of word recognition. The results were standardised against matched healthy controls. Performance on all emotion tasks in patients with unilateral lobectomy ranged from unimpaired to moderately impaired. Perception of emotions in faces and voices was (with exceptions) significantly positively correlated, indicating multimodal emotional processing. However, there was no correlation between the subjects' performance on tests of emotional memory and perception: several subjects showed strong emotional memory enhancement but poor fear perception. Patients with bilateral amygdala damage had greater impairment, particularly on the narrative test of emotional memory, one showing superior fear recognition but absent memory enhancement. Bilateral amygdala damage is particularly disruptive of emotional memory processes in comparison with unilateral temporal lobectomy. On a cognitive level, the pattern of results implies that perception of emotional expressions and emotional memory are supported by separate processing systems or streams.
Colzato, Lorenza S; Sellaro, Roberta; Beste, Christian
2017-07-01
Charles Darwin proposed that emotional facial expressions are evolved, adaptive, and serve a crucial communicative function mediated by the vagus nerve, the tenth cranial nerve. In line with this idea, the later-developed polyvagal theory assumes that the vagus nerve is the key phylogenetic substrate regulating emotional and social behavior, and that optimal social interaction, including the recognition of emotion in faces, is modulated by the vagus nerve. So far, a causal role of the vagus nerve in emotion recognition has not been demonstrated in humans. To investigate this, we employed transcutaneous vagus nerve stimulation (tVNS), a novel non-invasive brain stimulation technique that modulates brain activity via bottom-up mechanisms. A sham/placebo-controlled, randomized, cross-over, within-subjects design was used to infer a causal relation between stimulation of the vagus nerve and the ability to recognize emotions, as indexed by the Reading the Mind in the Eyes Test, in 38 healthy young volunteers. Active tVNS, compared to sham stimulation, enhanced emotion recognition for easy items, suggesting that it promoted the ability to decode salient social cues. Our results confirm that the vagus nerve is causally involved in emotion recognition, supporting Darwin's argumentation.
Cross-cultural emotional prosody recognition: evidence from Chinese and British listeners.
Paulmann, Silke; Uskul, Ayse K
2014-01-01
This cross-cultural study of emotional tone-of-voice recognition tests the in-group advantage hypothesis (Elfenbein & Ambady, 2002) employing a quasi-balanced design. Individuals of Chinese and British background were asked to recognise pseudosentences produced by Chinese and British native speakers, displaying one of seven emotions (anger, disgust, fear, happiness, neutral tone of voice, sadness, and surprise). Findings reveal that emotional displays were recognised at rates higher than predicted by chance; however, members of each cultural group were more accurate in recognising the displays communicated by a member of their own cultural group than those of a member of the other cultural group. Moreover, the evaluation of error matrices indicates that both culture groups relied on similar mechanisms when recognising emotional displays from the voice. Overall, the study reveals evidence for both universal and culture-specific principles in vocal emotion recognition.
Intelligibility of emotional speech in younger and older adults.
Dupuis, Kate; Pichora-Fuller, M Kathleen
2014-01-01
Little is known about the influence of vocal emotions on speech understanding. Word recognition accuracy for stimuli spoken to portray seven emotions (anger, disgust, fear, sadness, neutral, happiness, and pleasant surprise) was tested in younger and older listeners. Emotions were presented in either mixed (heterogeneous emotions mixed in a list) or blocked (a homogeneous emotion blocked in a list) conditions. Three main hypotheses were tested. First, vocal emotion affects word recognition accuracy; specifically, portrayals of fear enhance word recognition accuracy because listeners orient to threatening information and/or distinctive acoustical cues such as high pitch mean and variation. Second, older listeners recognize words less accurately than younger listeners, but the effects of different emotions on intelligibility are similar across age groups. Third, blocking emotions in a list results in better word recognition accuracy, especially for older listeners, and reduces the effect of emotion on intelligibility, because as listeners develop expectations about vocal emotion, the allocation of processing resources can shift from emotional to lexical processing. Emotion was the within-subjects variable: all participants heard speech stimuli consisting of a carrier phrase followed by a target word spoken by either a younger or an older talker, with an equal number of stimuli portraying each of the seven vocal emotions. The speech was presented in multi-talker babble at signal-to-noise ratios adjusted for each talker and each listener age group. Listener age (younger, older), condition (mixed, blocked), and talker (younger, older) were the main between-subjects variables. Fifty-six students (Mage = 18.3 years) were recruited from an undergraduate psychology course; 56 older adults (Mage = 72.3 years) were recruited from a volunteer pool. All participants had clinically normal pure-tone audiometric thresholds at frequencies ≤3000 Hz.
There were significant main effects of emotion, listener age group, and condition on the accuracy of word recognition in noise. Stimuli spoken in a fearful voice were the most intelligible, while those spoken in a sad voice were the least intelligible. Overall, word recognition accuracy was poorer for older than younger adults, but there was no main effect of talker, and the pattern of the effects of different emotions on intelligibility did not differ significantly across age groups. Acoustical analyses helped elucidate the effect of emotion and some intertalker differences. Finally, all participants performed better when emotions were blocked. For both groups, performance improved over repeated presentations of each emotion in both blocked and mixed conditions. These results are the first to demonstrate a relationship between vocal emotion and word recognition accuracy in noise for younger and older listeners. In particular, the enhancement of intelligibility by emotion is greatest for words spoken to portray fear and presented heterogeneously with other emotions. Fear may have a specialized role in orienting attention to words heard in noise. This finding may be an auditory counterpart to the enhanced detection of threat information in visual displays. The effect of vocal emotion on word recognition accuracy is preserved in older listeners with good audiograms and both age groups benefit from blocking and the repetition of emotions.
Emotional content enhances true but not false memory for categorized stimuli.
Choi, Hae-Yoon; Kensinger, Elizabeth A; Rajaram, Suparna
2013-04-01
Past research has shown that emotion enhances true memory, but that emotion can either increase or decrease false memory. Two theoretical possibilities, the distinctiveness of emotional stimuli and the conceptual relatedness of emotional content, have been implicated as being responsible for influencing both true and false memory for emotional content. In the present study, we sought to identify the mechanisms that underlie these mixed findings by equating the thematic relatedness of the study materials across each type of valence used (negative, positive, or neutral). In three experiments, categorically bound stimuli (e.g., funeral, pets, and office items) were used for this purpose. When the encoding task required the processing of thematic relatedness, a significant true-memory enhancement for emotional content emerged in recognition memory, but no emotional boost to false memory (Experiment 1). This pattern persisted for true memory with a longer retention interval between study and test (24 h), and false recognition was reduced for emotional items (Experiment 2). Finally, better recognition memory for emotional items once again emerged when the encoding task (arousal ratings) required the processing of the emotional aspect of the study items, with no emotional boost to false recognition (Experiment 3). Together, these findings suggest that when emotional and neutral stimuli are equivalently high in thematic relatedness, emotion continues to improve true memory, but it does not override other types of grouping to increase false memory.
Alexithymia and Mood: Recognition of Emotion in Self and Others.
Lyvers, Michael; Kohlsdorf, Susan M; Edwards, Mark S; Thorberg, Fred Arne
2017-01-01
The present study explored relationships between alexithymia, a trait characterized by difficulties identifying and describing feelings and an externally oriented thinking style, and negative moods, negative mood regulation expectancies, facial recognition of emotions, emotional empathy, and alcohol consumption. The sample consisted of 102 university (primarily psychology) students (13 men, 89 women) aged 18 to 50 years (M = 22.18 years). Participants completed the Toronto Alexithymia Scale (TAS-20), Negative Mood Regulation Scale (NMRS), Depression Anxiety Stress Scales (DASS-21), Reading the Mind in the Eyes Test (RMET), Interpersonal Reactivity Index (IRI), and Alcohol Use Disorders Identification Test (AUDIT). Results were consistent with previous findings of positive relationships of TAS-20 alexithymia scores with both alcohol use (AUDIT) and negative moods (DASS-21), and a negative relationship with emotional self-regulation as indexed by the NMRS. Predicted negative associations of both overall TAS-20 alexithymia scores and the externally oriented thinking (EOT) subscale of the TAS-20 with both RMET facial recognition of emotions and the empathic concern (EC) subscale of the IRI were supported. The mood self-regulation index NMRS fully mediated the relationship between alexithymia and negative moods. Hierarchical linear regressions revealed that, after other relevant variables were controlled for, the EOT subscale of the TAS-20 predicted RMET and EC scores. The concrete-thinking, or EOT, facet of alexithymia thus appears to be associated with diminished facial recognition of emotions and reduced emotional empathy. The negative moods associated with alexithymia appear to be linked to subjective difficulties in self-regulation of emotions.
Oerlemans, Anoek M; van der Meer, Jolanda M J; van Steijn, Daphne J; de Ruiter, Saskia W; de Bruijn, Yvette G E; de Sonneville, Leo M J; Buitelaar, Jan K; Rommelse, Nanda N J
2014-05-01
Autism is a highly heritable and clinically heterogeneous neuropsychiatric disorder that frequently co-occurs with other psychopathologies, such as attention-deficit/hyperactivity disorder (ADHD). One approach to parsing this heterogeneity is to form more homogeneous subgroups of autism spectrum disorder (ASD) patients based on their underlying, heritable cognitive vulnerabilities (endophenotypes). Emotion recognition is a likely endophenotypic candidate for ASD and possibly for ADHD. Therefore, this study aimed to examine whether emotion recognition is a viable endophenotypic candidate for ASD and to assess the impact of comorbid ADHD in this context. A total of 90 children with ASD (43 with and 47 without ADHD), 79 unaffected siblings of children with ASD, and 139 controls aged 6-13 years were included to test recognition of facial emotion and affective prosody. Our results revealed that the recognition of both facial emotion and affective prosody was impaired in children with ASD and aggravated by the presence of ADHD. This aggravation could only partly be explained by typical ADHD cognitive deficits, such as inhibitory and attentional problems. The performance of unaffected siblings was overall at an intermediate level: somewhat worse than the controls and better than the ASD probands. Our findings suggest that emotion recognition might be a viable endophenotype in ASD and a fruitful target in future family studies of the genetic contribution to ASD and comorbid ADHD. Furthermore, our results suggest that children with comorbid ASD and ADHD are at highest risk for emotion recognition problems.
Gender Differences in the Recognition of Vocal Emotions
Lausen, Adi; Schacht, Annekathrin
2018-01-01
The conflicting findings from the few studies conducted on gender differences in the recognition of vocal expressions of emotion have left the exact nature of these differences unclear. Several investigators have argued that a comprehensive understanding of gender differences in vocal emotion recognition can only be achieved by replicating these studies while accounting for influential factors such as stimulus type, gender-balanced samples, and the number of encoders, decoders, and emotional categories. This study aimed to account for these factors by investigating whether emotion recognition from vocal expressions differs as a function of both listeners' and speakers' gender. A total of N = 290 participants were randomly and equally allocated to two groups. One group listened to words and pseudo-words, while the other group listened to sentences and affect bursts. Participants were asked to categorize the stimuli with respect to the expressed emotions in a fixed-choice response format. Overall, females were more accurate than males when decoding vocal emotions; however, when testing for specific emotions, these differences were small in magnitude. Speakers' gender had a significant impact on how listeners judged emotions from the voice. The group listening to words and pseudo-words had higher identification rates for emotions spoken by male than by female actors, whereas in the group listening to sentences and affect bursts the identification rates were higher when emotions were uttered by female than by male actors. The mixed pattern of emotion-specific effects, however, indicates that, in the vocal channel, the reliability of emotion judgments is not systematically influenced by speakers' gender and the related stereotypes of emotional expressivity. Together, these results extend previous findings by showing effects of listeners' and speakers' gender on the recognition of vocal emotions, and they stress the importance of distinguishing these factors to explain recognition ability in the processing of emotional prosody. PMID:29922202
Rigon, Arianna; Turkstra, Lyn; Mutlu, Bilge; Duff, Melissa
2016-10-01
Although moderate to severe traumatic brain injury (TBI) leads to facial affect recognition impairments in up to 39% of individuals, protective and risk factors for these deficits are unknown. The aim of the current study was to examine the effect of sex on emotion recognition abilities following TBI. We administered two separate emotion recognition tests (one static and one dynamic) to 53 individuals with moderate to severe TBI (females = 28) and 49 demographically matched comparisons (females = 22), and then investigated the presence of a sex-by-group interaction in emotion recognition accuracy. In the comparison group, there were no sex differences. In the TBI group, however, females significantly outperformed males in the dynamic (but not the static) task. Moreover, males (but not females) with TBI performed significantly worse than comparison participants in the dynamic task. Further analysis revealed that sex differences in emotion recognition abilities within the TBI group could not be explained by lesion location, TBI severity, or other neuropsychological variables. These findings suggest that sex may serve as a protective factor for social impairment following TBI; they inform clinical work with TBI patients as well as research on the neurophysiological correlates of sex differences in social functioning.
Li, Shijia; Weerda, Riklef; Guenzel, Friederike; Wolf, Oliver T; Thiel, Christiane M
2013-07-01
Previous studies have shown that acute psychosocial stress impairs retrieval of declarative memory, with emotional material being especially sensitive to this effect. A functional deletion variant of the ADRA2B gene encoding the α2B-adrenergic receptor has been shown to increase emotional memory and neural activity in the amygdala. We investigated the effects of acute psychosocial stress and the ADRA2B allele on recognition memory for emotional and neutral faces. Forty-two healthy, non-smoking male volunteers (30 deletion carriers, 12 noncarriers) were tested with a face recognition paradigm. During encoding they were presented with emotional and neutral faces. One hour later, participants underwent either a stress (Trier Social Stress Test, TSST) or a control procedure, which was followed immediately by the retrieval session, in which subjects had to indicate whether each presented face was old or new. Stress increased salivary cortisol concentrations, blood pressure, and pulse, and impaired recognition memory for faces independent of emotional valence and genotype. Participants showed generally slower reaction times to emotional faces. Carriers of the ADRA2B functional deletion variant showed impaired recognition and slower retrieval of neutral faces under stress; further, they were significantly slower in retrieving fearful faces in the control condition. The findings indicate that a genetic variation of the noradrenergic system may preserve emotional faces from the stress-induced memory impairments seen for neutral faces and may heighten reactivity to emotional stimuli under control conditions.
Cost-sensitive learning for emotion robust speaker recognition.
Li, Dongdong; Yang, Yingchun; Dai, Weihui
2014-01-01
In the field of information security, voice is one of the most important biometrics. In particular, with the development of voice communication over the Internet and telephone systems, huge voice data resources have become accessible. In speaker recognition, the voiceprint can serve as a unique password with which a user proves his or her identity. However, speech carrying various emotions can cause an unacceptably high error rate and degrade the performance of a speaker recognition system. This paper addresses the problem by introducing a cost-sensitive learning technique that reweights the probability of test affective utterances at the pitch-envelope level, which effectively enhances robustness in emotion-dependent speaker recognition. Based on this technique, a new recognition system architecture and its components are proposed. An experiment conducted on the Mandarin Affective Speech Corpus shows an 8% improvement in identification rate over traditional speaker recognition. PMID:24999492
Coleman, Jonathan R.I.; Lester, Kathryn J.; Keers, Robert; Munafò, Marcus R.; Breen, Gerome
2017-01-01
Emotion recognition is disrupted in many mental health disorders, which may reflect shared genetic aetiology between this trait and these disorders. We explored genetic influences on emotion recognition and the relationship between these influences and mental health phenotypes. Eight-year-old participants (n = 4,097) from the Avon Longitudinal Study of Parents and Children (ALSPAC) completed the Diagnostic Analysis of Non-Verbal Accuracy (DANVA) faces test. Genome-wide genotype data were available from the Illumina HumanHap550 Quad microarray. Genome-wide association studies were performed to assess associations with recognition of individual emotions and emotion in general. Exploratory polygenic risk scoring was performed using published genomic data for schizophrenia, bipolar disorder, depression, autism spectrum disorder, anorexia, and anxiety disorders. No individual genetic variants were identified at conventional levels of significance in any analysis, although several loci were associated at a level suggestive of significance. SNP-chip heritability analyses did not identify a heritable component of variance for any phenotype, and polygenic scores were not associated with any phenotype. The effect sizes of variants influencing emotion recognition are therefore likely to be small. Previous studies of emotion identification have yielded non-zero estimates of SNP heritability; this discrepancy is likely due to differences in the measurement and analysis of the phenotype. PMID:28608620
Impaired recognition of facial emotions from low-spatial frequencies in Asperger syndrome.
Kätsyri, Jari; Saalasti, Satu; Tiippana, Kaisa; von Wendt, Lennart; Sams, Mikko
2008-01-01
The theory of 'weak central coherence' [Happe, F., & Frith, U. (2006). The weak coherence account: Detail-focused cognitive style in autism spectrum disorders. Journal of Autism and Developmental Disorders, 36(1), 5-25] implies that persons with autism spectrum disorders (ASDs) have a perceptual bias for local but not for global stimulus features. The recognition of emotional facial expressions representing different levels of detail has not been studied previously in ASDs. We analyzed the recognition of four basic emotional facial expressions (anger, disgust, fear and happiness) from low-spatial frequencies (overall global shapes without local features) in adults with an ASD. A group of 20 participants with Asperger syndrome (AS) was compared to a group of non-autistic age- and sex-matched controls. Emotion recognition was tested from static and dynamic facial expressions whose spatial frequency contents had been manipulated by low-pass filtering at two levels. The two groups recognized emotions similarly from non-filtered faces and from dynamic vs. static facial expressions. In contrast, the participants with AS were less accurate than controls in recognizing facial emotions from very low-spatial frequencies. The results suggest intact recognition of basic facial emotions and dynamic facial information, but impaired visual processing of global features in ASDs.
Duque, Aránzazu; Vinader-Caerols, Concepción; Monleón, Santiago
2017-01-01
We have previously observed the impairing effects of chronic social defeat stress (CSDS) on emotional memory in mice. Given the relation between stress and inflammatory processes, we sought to study the effectiveness of the anti-inflammatory indomethacin in reversing the detrimental effects of CSDS on emotional memory in mice. The effects of CSDS and indomethacin on recognition memory were also evaluated. Male CD1 mice were randomly divided into four groups: non-stressed + saline (NS+SAL); non-stressed + indomethacin (NS+IND); stressed + saline (S+SAL); and stressed + indomethacin (S+IND). Stressed animals were exposed to a daily 10 min agonistic confrontation (CSDS) for 20 days. All subjects were treated daily with saline or indomethacin (10 mg/kg, i.p.). 24 h after the CSDS period, all the mice were evaluated in a social interaction test to distinguish between those that were resilient or susceptible to social stress. All subjects (n = 10-12 per group) were then evaluated in inhibitory avoidance (IA), novel object recognition (NOR), elevated plus maze and hot plate tests. As in control animals (NS+SAL group), IA learning was observed in the resilient groups, as well as in the susceptible mice treated with indomethacin (S+IND group). Recognition memory was observed in the non-stressed and the resilient mice, but not in the susceptible animals. Also, stressed mice exhibited higher anxiety levels. No significant differences were observed in locomotor activity or analgesia. In conclusion, CSDS induces anxiety in post-pubertal mice and impairs emotional and recognition memory in the susceptible subjects. The effects of CSDS on emotional memory, but not on recognition memory and anxiety, are reversed by indomethacin. Moreover, memory impairment is not secondary to the effects of CSDS on locomotor activity, emotionality or pain sensitivity.
PMID:28278165
The coupling of emotion and cognition in the eye: introducing the pupil old/new effect.
Võ, Melissa L-H; Jacobs, Arthur M; Kuchinke, Lars; Hofmann, Markus; Conrad, Markus; Schacht, Annekathrin; Hutzler, Florian
2008-01-01
The study presented here investigated the effects of emotional valence on memory for words by assessing both memory performance and pupillary responses during a recognition memory task. Participants had to make speeded judgments on whether a word presented in the test phase of the experiment had already been presented ("old") or not ("new"). An emotion-induced recognition bias was observed: words with emotional content not only produced a higher number of hits, but also elicited more false alarms than neutral words. Further, we found a distinct pupil old/new effect, characterized as an elevated pupillary response to hits as opposed to correct rejections. Interestingly, this pupil old/new effect was clearly diminished for emotional words. We therefore argue that the pupil old/new effect not only mirrors memory retrieval processes, but also reflects modulation by an emotion-induced recognition bias.
Interplay between affect and arousal in recognition memory.
Greene, Ciara M; Bahri, Pooja; Soto, David
2010-07-23
Emotional states linked to arousal and mood are known to affect the efficiency of cognitive performance. However, the extent to which memory processes may be affected by arousal, mood or their interaction is poorly understood. Following a study phase of abstract shapes, we altered the emotional state of participants by means of exposure to music that varied in both mood and arousal dimensions, leading to four different emotional states: (i) positive mood-high arousal; (ii) positive mood-low arousal; (iii) negative mood-high arousal; (iv) negative mood-low arousal. Following the emotional induction, participants performed a memory recognition test. Critically, there was an interaction between mood and arousal on recognition performance. Memory was enhanced in the positive mood-high arousal and in the negative mood-low arousal states, relative to the other emotional conditions. Neither mood nor arousal alone but their interaction appears most critical to understanding the emotional enhancement of memory.
Development of emotional facial recognition in late childhood and adolescence.
Thomas, Laura A; De Bellis, Michael D; Graham, Reiko; LaBar, Kevin S
2007-09-01
The ability to interpret emotions in facial expressions is crucial for social functioning across the lifespan. Facial expression recognition develops rapidly during infancy and improves with age during the preschool years. However, the developmental trajectory from late childhood to adulthood is less clear. We tested older children, adolescents and adults on a two-alternative forced-choice discrimination task using morphed faces that varied in emotional content. Actors appeared to pose expressions that changed incrementally along three progressions: neutral-to-fear, neutral-to-anger, and fear-to-anger. Across all three morph types, adults displayed more sensitivity to subtle changes in emotional expression than children and adolescents. Fear morphs and fear-to-anger blends showed a linear developmental trajectory, whereas anger morphs showed a quadratic trend, increasing sharply from adolescents to adults. The results provide evidence for late developmental changes in emotional expression recognition with some specificity in the time course for distinct emotions.
Music to my ears: Age-related decline in musical and facial emotion recognition.
Sutcliffe, Ryan; Rendell, Peter G; Henry, Julie D; Bailey, Phoebe E; Ruffman, Ted
2017-12-01
We investigated young-old differences in emotion recognition using music and face stimuli and tested explanatory hypotheses regarding older adults' typically worse emotion recognition. In Experiment 1, young and older adults labeled emotions in an established set of faces, and in classical piano stimuli that we pilot-tested on other young and older adults. Older adults were worse at detecting anger, sadness, fear, and happiness in music. Performance on the music and face emotion tasks was not correlated for either age group. Because musical expressions of fear were not equated for age groups in the pilot study of Experiment 1, we conducted a second experiment in which we created a novel set of music stimuli that included more accessible musical styles, and which we again pilot-tested on young and older adults. In this pilot study, all musical emotions were identified similarly by young and older adults. In Experiment 2, participants also made age estimations in another set of faces to examine whether potential relations between the face and music emotion tasks would be shared with the age estimation task. Older adults did worse in each of the tasks, and had specific difficulty recognizing happy, sad, peaceful, angry, and fearful music clips. Older adults' difficulties in each of the 3 tasks-music emotion, face emotion, and face age-were not correlated with each other. General cognitive decline did not appear to explain our results as increasing age predicted emotion performance even after fluid IQ was controlled for within the older adult group.
"We all look the same to me": positive emotions eliminate the own-race bias in face recognition.
Johnson, Kareem J; Fredrickson, Barbara L
2005-11-01
Extrapolating from the broaden-and-build theory, we hypothesized that positive emotion may reduce the own-race bias in facial recognition. In Experiments 1 and 2, Caucasian participants (N = 89) viewed Black and White faces for a recognition task. They viewed videos eliciting joy, fear, or neutrality before the learning (Experiment 1) or testing (Experiment 2) stages of the task. Results reliably supported the hypothesis. Relative to fear or a neutral state, joy experienced before either stage improved recognition of Black faces and significantly reduced the own-race bias. Discussion centers on possible mechanisms for this reduction of the own-race bias, including improvements in holistic processing and promotion of a common in-group identity due to positive emotions.
The Development of Emotion Recognition in Individuals with Autism
ERIC Educational Resources Information Center
Rump, Keiran M.; Giovannelli, Joyce L.; Minshew, Nancy J.; Strauss, Mark S.
2009-01-01
Emotion recognition was investigated in typically developing individuals and individuals with autism. Experiment 1 tested children (5-7 years, n = 37) with brief video displays of facial expressions that varied in subtlety. Children with autism performed worse than the control children. In Experiment 2, 3 age groups (8-12 years, n = 49; 13-17…
Time course of effects of emotion on item memory and source memory for Chinese words.
Wang, Bo; Fu, Xiaolan
2011-05-01
Although many studies have investigated the effect of emotion on memory, it is unclear whether the effect of emotion extends to all aspects of an event. It is also poorly understood how the effects of emotion on item memory and source memory change over time. This study examined the time course of the effects of emotion on item memory and source memory. Participants intentionally learned a list of neutral, positive, and negative Chinese words, which were presented twice, and then took a free recall test, followed by recognition and source memory tests, at one of eight delayed points of time. The main findings (within a time frame of 2 weeks) are: (1) Negative emotion enhances free recall, whereas there is only a trend for positive emotion to enhance free recall. In addition, negative and positive emotions differ in the point of time at which their effects on free recall reach the greatest magnitude. (2) Negative emotion reduces recognition, whereas positive emotion has no effect on recognition. (3) Neither positive nor negative emotion has any effect on source memory. These findings indicate that the effect of emotion does not necessarily extend to all aspects of an event and that valence is a critical modulating factor in the effect of emotion on item memory. Furthermore, emotion does not affect the time course of item memory and source memory, at least within a time frame of 2 weeks. This study has implications for establishing a theoretical model of the effect of emotion on memory.
Facial recognition in primary focal dystonia.
Rinnerthaler, Martina; Benecke, Cord; Bartha, Lisa; Entner, Tanja; Poewe, Werner; Mueller, Joerg
2006-01-01
The basal ganglia seem to be involved in emotional processing. Primary dystonia is a movement disorder considered to result from basal ganglia dysfunction, and the aim of the present study was to investigate emotion recognition in patients with primary focal dystonia. Thirty-two patients with primary cranial (n=12) and cervical (n=20) dystonia were compared to 32 healthy controls matched for age, sex, and educational level on the facially expressed emotion labeling (FEEL) test, a computer-based tool measuring a person's ability to recognize facially expressed emotions. Patients with cognitive impairment or depression were excluded. None of the patients received medication with a possible cognitive side effect profile and only those with mild to moderate dystonia were included. Patients with primary dystonia showed isolated deficits in the recognition of disgust (P=0.007), while no differences between patients and controls were found with regard to the other emotions (fear, happiness, surprise, sadness, and anger). The findings of the present study add further evidence to the conception that dystonia is not only a motor but a complex basal ganglia disorder including selective emotion recognition disturbances.
Face to face: blocking facial mimicry can selectively impair recognition of emotional expressions.
Oberman, Lindsay M; Winkielman, Piotr; Ramachandran, Vilayanur S
2007-01-01
People spontaneously mimic a variety of behaviors, including emotional facial expressions. Embodied cognition theories suggest that mimicry reflects internal simulation of perceived emotion in order to facilitate its understanding. If so, blocking facial mimicry should impair recognition of expressions, especially of emotions that are simulated using facial musculature. The current research tested this hypothesis using four expressions (happy, disgust, fear, and sad) and two mimicry-interfering manipulations: (1) biting on a pen and (2) chewing gum, as well as two control conditions. Experiment 1 used electromyography over cheek, mouth, and nose regions. The bite manipulation consistently activated the assessed muscles, whereas the chew manipulation activated them only intermittently. Further, expressing happiness generated the most facial action. Experiment 2 found that the bite manipulation interfered most with recognition of happiness. These findings suggest that facial mimicry differentially contributes to recognition of specific facial expressions, thus allowing for more refined predictions from embodied cognition theories.
Emotion recognition from speech: tools and challenges
NASA Astrophysics Data System (ADS)
Al-Talabani, Abdulbasit; Sellahewa, Harin; Jassim, Sabah A.
2015-05-01
Human emotion recognition from speech is studied widely because of its importance in many applications, e.g. human-computer interaction. There is wide diversity and little agreement about the set of basic emotions or emotion-related states on the one hand, and about where the emotion-related information lies in the speech signal on the other. These diversities motivate our investigation of extracting meta-features using a PCA approach, or using a non-adaptive random projection (RP), which significantly reduces the large-dimensional speech feature vectors that may contain a wide range of emotion-related information. Subsets of meta-features are fused to increase the performance of the recognition model, which adopts a score-based LDC classifier. We demonstrate that our scheme outperforms state-of-the-art results when tested on non-prompted databases or acted databases (i.e. when subjects act specific emotions while uttering a sentence). However, the large gap between the accuracy rates achieved on the different types of speech datasets raises questions about the way emotions modulate speech. In particular, we argue that emotion recognition from speech should not be treated as a classification problem. We demonstrate the presence of a spectrum of different emotions in the same speech portion, especially in the non-prompted datasets, which tend to be more "natural" than the acted datasets, in which the subjects attempt to suppress all but one emotion.
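The two dimensionality-reduction routes named above (PCA meta-features and a non-adaptive random projection) can be sketched as follows. The feature matrix here is simulated; a real system would first extract prosodic and spectral features from speech frames, and the dimensions (200 utterances, 1500 features, 40 components) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 1500))   # 200 utterances, 1500 raw speech features

k = 40                             # target meta-feature dimensionality

# Route 1: PCA meta-features — project centred data onto the top-k
# principal directions obtained from the SVD of the data matrix.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
X_pca = Xc @ Vt[:k].T              # shape (200, 40)

# Route 2: non-adaptive random projection — a fixed Gaussian matrix,
# scaled so pairwise distances are approximately preserved
# (Johnson-Lindenstrauss style). No training data needed.
R = rng.normal(size=(1500, k)) / np.sqrt(k)
X_rp = X @ R                       # shape (200, 40)
```

The practical trade-off: PCA adapts to the training data and captures its dominant variance directions, while the random projection is data-independent and therefore cheaper and immune to overfitting the projection itself.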
Li, Shijia; Weerda, Riklef; Milde, Christopher; Wolf, Oliver T; Thiel, Christiane M
2014-12-01
Previous studies have shown that acute psychosocial stress impairs recognition of declarative memory and that emotional material is especially sensitive to this effect. Animal studies suggest a central role of the amygdala which modulates memory processes in hippocampus, prefrontal cortex and other brain areas. We used functional magnetic resonance imaging (fMRI) to investigate neural correlates of stress-induced modulation of emotional recognition memory in humans. Twenty-seven healthy, right-handed, non-smoker male volunteers performed an emotional face recognition task. During encoding, participants were presented with 50 fearful and 50 neutral faces. One hour later, they underwent either a stress (Trier Social Stress Test) or a control procedure outside the scanner which was followed immediately by the recognition session inside the scanner, where participants had to discriminate between 100 old and 50 new faces. Stress increased salivary cortisol, blood pressure and pulse, and decreased the mood of participants but did not impact recognition memory. BOLD data during recognition revealed a stress condition by emotion interaction in the left inferior frontal gyrus and right hippocampus which was due to a stress-induced increase of neural activity to fearful and a decrease to neutral faces. Functional connectivity analyses revealed a stress-induced increase in coupling between the right amygdala and the right fusiform gyrus, when processing fearful as compared to neutral faces. Our results provide evidence that acute psychosocial stress affects medial temporal and frontal brain areas differentially for neutral and emotional items, with a stress-induced privileged processing of emotional stimuli.
Visual Scanning Patterns and Executive Function in Relation to Facial Emotion Recognition in Aging
Circelli, Karishma S.; Clark, Uraina S.; Cronin-Golomb, Alice
2012-01-01
Objective: The ability to perceive facial emotion varies with age. Relative to younger adults (YA), older adults (OA) are less accurate at identifying fear, anger, and sadness, and more accurate at identifying disgust. Because different emotions are conveyed by different parts of the face, changes in visual scanning patterns may account for age-related variability. We investigated the relation between scanning patterns and recognition of facial emotions. Additionally, as frontal-lobe changes with age may affect scanning patterns and emotion recognition, we examined correlations between scanning parameters and performance on executive function tests. Methods: We recorded eye movements from 16 OA (mean age 68.9) and 16 YA (mean age 19.2) while they categorized facial expressions and non-face control images (landscapes), and administered standard tests of executive function. Results: OA were less accurate than YA at identifying fear (p<.05, r=.44) and more accurate at identifying disgust (p<.05, r=.39). OA fixated less than YA on the top half of the face for disgust, fearful, happy, neutral, and sad faces (p's<.05, r's≥.38), whereas there was no group difference for landscapes. For OA, executive function was correlated with recognition of sad expressions and with scanning patterns for fearful, sad, and surprised expressions. Conclusion: We report significant age-related differences in visual scanning that are specific to faces. The observed relation between scanning patterns and executive function supports the hypothesis that frontal-lobe changes with age may underlie some changes in emotion recognition. PMID:22616800
Effects of exposure to facial expression variation in face learning and recognition.
Liu, Chang Hong; Chen, Wenfeng; Ward, James
2015-11-01
Facial expression is a major source of image variation in face images. Linking numerous expressions to the same face can be a huge challenge for face learning and recognition. It remains largely unknown what level of exposure to this image variation is critical for expression-invariant face recognition. We examined this issue in a recognition memory task, where the number of facial expressions of each face being exposed during a training session was manipulated. Faces were either trained with multiple expressions or a single expression, and they were later tested in either the same or different expressions. We found that recognition performance after learning three emotional expressions had no improvement over learning a single emotional expression (Experiments 1 and 2). However, learning three emotional expressions improved recognition compared to learning a single neutral expression (Experiment 3). These findings reveal both the limitation and the benefit of multiple exposures to variations of emotional expression in achieving expression-invariant face recognition. The transfer of expression training to a new type of expression is likely to depend on a relatively extensive level of training and a certain degree of variation across the types of expressions.
Verdejo-García, Antonio; Albein-Urios, Natalia; Molina, Esther; Ching-López, Ana; Martínez-González, José M; Gutiérrez, Blanca
2013-11-01
Based on previous evidence of a MAOA gene*cocaine use interaction on orbitofrontal cortex volume attrition, we tested whether the MAOA low-activity variant and cocaine use severity are interactively associated with impulsivity and behavioral indices of orbitofrontal dysfunction: emotion recognition and decision-making. 72 cocaine dependent individuals and 52 non-drug-using controls (including healthy individuals and problem gamblers) were genotyped for the MAOA gene and tested using the UPPS-P Impulsive Behavior Scale, the Iowa Gambling Task and Ekman's Facial Emotions Recognition Test. To test the main hypothesis, we conducted hierarchical multiple regression analyses including three sets of predictors: (1) age, (2) MAOA genotype and severity of cocaine use, and (3) the interaction between MAOA genotype and severity of cocaine use. UPPS-P, Ekman Test and Iowa Gambling Task scores were the outcome measures. We computed the statistical significance of the prediction change yielded by each consecutive set, with a priori interest in the MAOA*cocaine severity interaction. We found significant effects of the MAOA gene*cocaine use severity interaction on the emotion recognition scores and the UPPS-P dimensions of Positive Urgency and Sensation Seeking: low-activity carriers with higher cocaine exposure had poorer emotion recognition and higher Positive Urgency and Sensation Seeking. Cocaine users carrying the MAOA low-activity variant show a greater impact of cocaine use on impulsivity and behavioral measures of orbitofrontal cortex dysfunction.
Social appraisal influences recognition of emotions.
Mumenthaler, Christian; Sander, David
2012-06-01
The notion of social appraisal emphasizes the importance of a social dimension in appraisal theories of emotion by proposing that the way an individual appraises an event is influenced by the way other individuals appraise and feel about the same event. This study directly tested this proposal by asking participants to recognize dynamic facial expressions of emotion (fear, happiness, or anger in Experiment 1; fear, happiness, anger, or neutral in Experiment 2) in a target face presented at the center of a screen while a contextual face, which appeared simultaneously in the periphery of the screen, expressed an emotion (fear, happiness, anger) or not (neutral) and either looked at the target face or not. We manipulated gaze direction to be able to distinguish between a mere contextual effect (gaze away from both the target face and the participant) and a specific social appraisal effect (gaze toward the target face). Results of both experiments provided evidence for a social appraisal effect in emotion recognition, which differed from the mere effect of contextual information: Whereas facial expressions were identical in both conditions, the direction of the gaze of the contextual face influenced emotion recognition. Social appraisal facilitated the recognition of anger, happiness, and fear when the contextual face expressed the same emotion. This facilitation was stronger than the mere contextual effect. Social appraisal also allowed better recognition of fear when the contextual face expressed anger and better recognition of anger when the contextual face expressed fear.
Spikman, Jacoba M; Milders, Maarten V; Visser-Keizer, Annemarie C; Westerhof-Evers, Herma J; Herben-Dekker, Meike; van der Naalt, Joukje
2013-01-01
Traumatic brain injury (TBI) is a leading cause of disability, specifically among younger adults. Behavioral changes are common after moderate to severe TBI and have adverse consequences for social and vocational functioning. It is hypothesized that deficits in social cognition, including facial affect recognition, might underlie these behavioral changes. Measurement of behavioral deficits is complicated, because the rating scales used rely on subjective judgement, often lack specificity and many patients provide unrealistically positive reports of their functioning due to impaired self-awareness. Accordingly, it is important to find performance based tests that allow objective and early identification of these problems. In the present study 51 moderate to severe TBI patients in the sub-acute and chronic stage were assessed with a test for emotion recognition (FEEST) and a questionnaire for behavioral problems (DEX) with a self and proxy rated version. Patients performed worse on the total score and on the negative emotion subscores of the FEEST than a matched group of 31 healthy controls. Patients also exhibited significantly more behavioral problems on both the DEX self and proxy rated version, but proxy ratings revealed more severe problems. No significant correlation was found between FEEST scores and DEX self ratings. However, impaired emotion recognition in the patients, and in particular of Sadness and Anger, was significantly correlated with behavioral problems as rated by proxies and with impaired self-awareness. This is the first study to find these associations, strengthening the proposed recognition of social signals as a condition for adequate social functioning. Hence, deficits in emotion recognition can be conceived as markers for behavioral problems and lack of insight in TBI patients. This finding is also of clinical importance since, unlike behavioral problems, emotion recognition can be objectively measured early after injury, allowing for early detection and treatment of these problems.
Effect of Time Delay on Recognition Memory for Pictures: The Modulatory Role of Emotion
Wang, Bo
2014-01-01
This study investigated the modulatory role of emotion in the effect of time delay on recognition memory for pictures. Participants viewed neutral, positive and negative pictures, and took a recognition memory test 5 minutes, 24 hours, or 1 week after learning. The findings are: 1) For neutral, positive and negative pictures, overall recognition accuracy at the 5-min delay did not significantly differ from that at the 24-h delay. For neutral and positive pictures, overall recognition accuracy at the 1-week delay was lower than at the 24-h delay; for negative pictures, overall recognition at the 24-h and 1-week delays did not significantly differ. Negative emotion therefore modulates the effect of time delay on recognition memory, preserving overall recognition accuracy only within a certain time frame. 2) For the three types of pictures, recollection and familiarity at the 5-min delay did not significantly differ from those at the 24-h and the 1-week delays. Thus emotion does not appear to modulate the effect of time delay on recollection and familiarity. However, recollection at the 24-h delay was higher than at the 1-week delay, whereas familiarity at the 24-h delay was lower than at the 1-week delay. PMID:24971457
Dissociation between facial and bodily expressions in emotion recognition: A case study.
Leiva, Samanta; Margulis, Laura; Micciulli, Andrea; Ferreres, Aldo
2017-12-21
Existing single-case studies have reported deficit in recognizing basic emotions through facial expression and unaffected performance with body expressions, but not the opposite pattern. The aim of this paper is to present a case study with impaired emotion recognition through body expressions and intact performance with facial expressions. In this single-case study we assessed a 30-year-old patient with autism spectrum disorder, without intellectual disability, and a healthy control group (n = 30) with four tasks of basic and complex emotion recognition through face and body movements, and two non-emotional control tasks. To analyze the dissociation between facial and body expressions, we used Crawford and Garthwaite's operational criteria, and we compared the patient and the control group performance with a modified one-tailed t-test designed specifically for single-case studies. There were no statistically significant differences between the patient's and the control group's performances on the non-emotional body movement task or the facial perception task. For both kinds of emotions (basic and complex) when the patient's performance was compared to the control group's, statistically significant differences were only observed for the recognition of body expressions. There were no significant differences between the patient's and the control group's correct answers for emotional facial stimuli. Our results showed a profile of impaired emotion recognition through body expressions and intact performance with facial expressions. This is the first case study that describes the existence of this kind of dissociation pattern between facial and body expressions of basic and complex emotions.
Facial emotion recognition ability: psychiatry nurses versus nurses from other departments.
Gultekin, Gozde; Kincir, Zeliha; Kurt, Merve; Catal, Yasir; Acil, Asli; Aydin, Aybike; Özcan, Mualla; Delikkaya, Busra N; Kacar, Selma; Emul, Murat
2016-12-01
Facial emotion recognition is a basic element of non-verbal communication. Although some researchers have shown that recognizing facial expressions may be important in the interaction between doctors and patients, there are no studies concerning facial emotion recognition in nurses. Here, we aimed to investigate facial emotion recognition ability in nurses and to compare this ability between nurses from psychiatry and other departments. In this cross-sectional study, sixty-seven nurses were divided into two groups according to their departments: psychiatry (n=31) and other departments (n=36). A Facial Emotion Recognition Test, constructed from a set of photographs from Ekman and Friesen's book "Pictures of Facial Affect", was administered to all participants. In the whole group, the highest mean accuracy rate was for recognizing happy facial expressions (99.14%), while the least accurately recognized facial expression was fear (47.71%). There were no significant differences between the two groups in mean accuracy rates for recognizing happy, sad, fearful, angry, and surprised facial expressions (for all, p>0.05). The ability to recognize disgusted and neutral facial emotions tended to be better in other nurses than in psychiatry nurses (p=0.052 and p=0.053, respectively). Conclusion: This study was the first to reveal no difference in facial emotion recognition ability between psychiatry nurses and non-psychiatry nurses. In medical education curricula throughout the world, no specific training program is scheduled for recognizing patients' emotional cues. We consider that improving the ability to recognize facial emotion expressions in medical staff might be beneficial in reducing inappropriate patient-staff interactions.
Yao, Shih-Ying; Bull, Rebecca; Khng, Kiat Hui; Rahim, Anisa
2018-01-01
Understanding a child's ability to decode emotion expressions is important to allow early intervention for potential difficulties in social and emotional functioning. This study applied the Rasch model to investigate the psychometric properties of the NEPSY-II Affect Recognition subtest, a U.S.-normed measure for 3- to 16-year-olds that assesses the ability to recognize facial expressions of emotion. Data were collected from 1222 children attending preschools in Singapore. We first performed the Rasch analysis with the raw item data and examined the technical qualities and difficulty pattern of the studied items. We subsequently investigated the relation of the estimated affect recognition ability from the Rasch analysis to a teacher-reported measure of a child's behaviors, emotions, and relationships. Potential gender differences were also examined. The Rasch model fit our data well. The NEPSY-II Affect Recognition subtest was also found to have reasonable technical qualities, the expected item difficulty pattern, and the desired association with the external measure of children's behaviors, emotions, and relationships for both boys and girls. Overall, findings from this study suggest that the NEPSY-II Affect Recognition subtest is a promising measure of young children's affect recognition ability. Suggestions for future test improvement and research are discussed.
Rigon, Arianna; Turkstra, Lyn; Mutlu, Bilge; Duff, Melissa
2018-01-01
Although moderate to severe traumatic brain injury (TBI) leads to facial affect recognition impairments in up to 39% of individuals, protective and risk factors for these deficits are unknown. The aim of the current study was to examine the effect of sex on emotion recognition abilities following TBI. We administered two separate emotion recognition tests (one static and one dynamic) to 53 individuals with moderate to severe TBI (Females=28) and 49 demographically matched comparisons (Females=22). We then investigated the presence of a sex-by-group interaction in emotion recognition accuracy. In the comparison group, there were no sex differences. In the TBI group, however, females significantly outperformed males in the dynamic (but not the static) task. Moreover, males (but not females) with TBI performed significantly worse than comparison participants in the dynamic task. Further analysis revealed that sex differences in emotion recognition abilities within the TBI group could not be explained by lesion location, TBI severity, or other neuropsychological variables. These findings suggest that sex may serve as a protective factor for social impairment following TBI and inform clinicians working with TBI as well as research on the neurophysiological correlates of sex differences in social functioning. PMID:27245826
CACNA1C risk variant affects facial emotion recognition in healthy individuals.
Nieratschker, Vanessa; Brückmann, Christof; Plewnia, Christian
2015-11-27
Recognition and correct interpretation of facial emotion is essential for social interaction and communication. Previous studies have shown that impairments in this cognitive domain are common features of several psychiatric disorders. Recent association studies identified CACNA1C as one of the most promising genetic risk factors for psychiatric disorders, and previous evidence suggests that the most replicated risk variant in CACNA1C (rs1006737) affects emotion recognition and processing. However, studies investigating the influence of rs1006737 on this intermediate phenotype in healthy subjects at the behavioral level are largely missing to date. Here, we applied the "Reading the Mind in the Eyes" test, a facial emotion recognition paradigm, in a cohort of 92 healthy individuals to address this question. Whereas accuracy was not affected by genotype, CACNA1C rs1006737 risk-allele carriers (AA/AG) showed significantly slower mean response times compared to individuals homozygous for the G-allele, indicating that healthy risk-allele carriers require more information to correctly identify a facial emotion. Our study is the first to provide evidence for an impairing behavioral effect of the CACNA1C risk variant rs1006737 on facial emotion recognition in healthy individuals and adds to the growing number of studies pointing towards CACNA1C as affecting intermediate phenotypes of psychiatric disorders.
Evaluating deep learning architectures for Speech Emotion Recognition.
Fayek, Haytham M; Lech, Margaret; Cavedon, Lawrence
2017-08-01
Speech Emotion Recognition (SER) can be regarded as a static or dynamic classification problem, which makes SER an excellent test bed for investigating and comparing various deep learning architectures. We describe a frame-based formulation of SER that relies on minimal speech processing and end-to-end deep learning to model intra-utterance dynamics. We use the proposed SER system to empirically explore feed-forward and recurrent neural network architectures and their variants. The experiments conducted illuminate the advantages and limitations of these architectures in paralinguistic speech recognition and emotion recognition in particular. As a result of our exploration, we report state-of-the-art results on the IEMOCAP database for speaker-independent SER and present quantitative and qualitative assessments of the models' performances.
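The frame-based formulation above can be sketched in a few lines: classify each speech frame independently, then pool the frame-level posteriors into one utterance-level decision. This is a deliberately minimal toy, not the paper's architecture; the feature dimensionality, hidden-layer size, random weights, and mean-pooling rule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over the last axis
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class FrameClassifier:
    """One-hidden-layer feed-forward net applied independently to each frame."""
    def __init__(self, n_features, n_hidden, n_classes):
        # Untrained random weights, for illustration only
        self.W1 = rng.normal(0, 0.1, (n_features, n_hidden))
        self.W2 = rng.normal(0, 0.1, (n_hidden, n_classes))

    def frame_posteriors(self, frames):
        # frames: (T, n_features) -> per-frame class posteriors (T, n_classes)
        h = np.tanh(frames @ self.W1)
        return softmax(h @ self.W2)

    def predict_utterance(self, frames):
        # Aggregate frame-level posteriors into a single utterance decision
        return int(self.frame_posteriors(frames).mean(axis=0).argmax())

clf = FrameClassifier(n_features=40, n_hidden=16, n_classes=4)
utterance = rng.normal(size=(120, 40))  # 120 frames of 40-dim features
label = clf.predict_utterance(utterance)  # an int in 0..3
```

In the actual system the frame classifier would be trained end-to-end and could be replaced by a recurrent network to model intra-utterance dynamics explicitly; the pooling step is where the "static vs. dynamic" framing enters.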
Sleep facilitates consolidation of emotional declarative memory.
Hu, Peter; Stylos-Allan, Melinda; Walker, Matthew P
2006-10-01
Both sleep and emotion are known to modulate processes of memory consolidation, yet their interaction is poorly understood. We examined the influence of sleep on consolidation of emotionally arousing and neutral declarative memory. Subjects completed an initial study session involving arousing and neutral pictures, either in the evening or in the morning. Twelve hours later, after sleeping or staying awake, subjects performed a recognition test requiring them to discriminate between these original pictures and novel pictures by responding "remember," "know" (familiar), or "new." Selective sleep effects were observed for consolidation of emotional memory: recognition accuracy for know judgments of arousing stimuli improved by 42% after sleep relative to wake, and recognition bias for remember judgments of these stimuli increased by 58% after sleep relative to wake (resulting in more conservative responding). These findings hold important implications for the understanding of human memory processing, suggesting that the facilitation of memory for emotionally salient information may preferentially develop during sleep.
[Emotion Recognition in Patients with Peripheral Facial Paralysis - A Pilot Study].
Konnerth, V; Mohr, G; von Piekartz, H
2016-02-01
The perception of emotions is an important component enabling human beings to engage in social interaction in everyday life. The ability to recognize emotions from another person's facial expression is thus a key prerequisite. The following study aimed to evaluate the ability of subjects with peripheral facial paresis to perceive emotions, compared with healthy individuals. A pilot study was conducted in which 13 people with peripheral facial paresis participated. The assessment included the Facially Expressed Emotion Labeling Test (FEEL-Test), the Facial Laterality Recognition Test (FLR-Test), and the Toronto Alexithymia Scale 26 (TAS-26). The results were compared with data on healthy people from other studies. Compared with healthy individuals, the subjects with facial paresis showed more difficulty in recognizing basic emotions; however, the difference was not significant. The participants were significantly slower (right/left: p<0.001) in perceiving facial laterality compared with healthy people. With regard to alexithymia, the tested group scored significantly higher (p<0.001) than unimpaired people. The present pilot study thus does not demonstrate an impairment of this specific patient group's ability to recognize emotions and facial laterality. For future studies, the research question should be examined in a larger sample.
Enrici, Ivan; Adenzato, Mauro; Ardito, Rita B.; Mitkova, Antonia; Cavallo, Marco; Zibetti, Maurizio; Lopiano, Leonardo; Castelli, Lorys
2015-01-01
Background Parkinson’s disease (PD) is characterised by well-known motor symptoms, whereas the presence of cognitive non-motor symptoms, such as emotional disturbances, is still underestimated. One of the major problems in studying emotion deficits in PD is an atomising approach that does not take into account different levels of emotion elaboration. Our study addressed the question of whether people with PD exhibit difficulties in one or more specific dimensions of emotion processing, investigating three levels of analysis: recognition, representation, and regulation. Methodology Thirty-two consecutive medicated patients with PD and 25 healthy controls were enrolled in the study. Participants completed a three-level assessment of emotional processing using quantitative standardised emotional tasks: the Ekman 60-Faces for emotion recognition, the full 36-item version of the Reading the Mind in the Eyes (RME) for emotion representation, and the 20-item Toronto Alexithymia Scale (TAS-20) for emotion regulation. Principal Findings For emotion recognition, patients obtained significantly worse scores than controls on the total score of the Ekman 60-Faces but not for any specific basic emotion. For emotion representation, patients obtained significantly worse scores than controls on the RME experimental score but not on the RME gender control task. Finally, for emotion regulation, PD patients and controls did not perform differently on the TAS-20, and no specific differences were found on the TAS-20 subscales. The PD impairments in emotion recognition and representation did not correlate with dopamine therapy, disease severity, or duration of illness. These results are independent of other cognitive processes, such as global cognitive status and executive function, and of psychiatric status, such as depression, anxiety, or apathy.
Conclusions These results may contribute to a better understanding of the emotional problems often seen in patients with PD and of the measures used to test these problems, in particular the use of different versions of the RME task. PMID:26110271
Martinelli, Eugenio; Mencattini, Arianna; Daprati, Elena; Di Natale, Corrado
2016-01-01
Humans can communicate their emotions by modulating facial expressions or the tone of their voice. Although numerous applications exist that enable machines to read facial emotions and recognize the content of verbal messages, methods for speech emotion recognition are still in their infancy. Yet fast and reliable applications for emotion recognition are the obvious advancement of present 'intelligent personal assistants', and may have countless applications in diagnostics, rehabilitation and research. Taking inspiration from the dynamics of human group decision-making, we devised a novel speech emotion recognition system that applies, for the first time, a semi-supervised prediction model based on consensus. Three tests were carried out to compare this algorithm with traditional approaches. Labeling performances relative to a public database of spontaneous speech are reported. The novel system appears to be fast, robust and less computationally demanding than traditional methods, allowing for easier implementation in portable voice-analyzers (as used in rehabilitation, research, industry, etc.) and for applications in the research domain (such as real-time pairing of stimuli to participants' emotional state, selective/differential data collection based on emotional content, etc.).
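The consensus idea described above can be illustrated with a deliberately toy committee: several weak classifiers label unlabelled samples, and only samples on which the whole committee agrees are promoted to pseudo-labelled training data. The threshold classifiers, the 1-D "arousal" feature, and the unanimity rule below are invented for illustration and are not the paper's actual prediction model.

```python
# Each committee member is a trivial threshold rule over a 1-D feature.
def make_threshold_classifier(threshold):
    return lambda x: "aroused" if x > threshold else "calm"

committee = [make_threshold_classifier(t) for t in (0.4, 0.5, 0.6)]

def consensus_label(x):
    # Return a label only when every committee member agrees; else None.
    labels = {clf(x) for clf in committee}
    return labels.pop() if len(labels) == 1 else None

unlabelled = [0.1, 0.45, 0.55, 0.9]
pseudo_labelled = [(x, consensus_label(x)) for x in unlabelled
                   if consensus_label(x) is not None]
print(pseudo_labelled)  # [(0.1, 'calm'), (0.9, 'aroused')]
```

Samples near the decision boundary (0.45, 0.55) get no consensus and stay unlabelled, which is the mechanism that keeps a semi-supervised scheme from amplifying its own uncertain guesses.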
The development of cross-cultural recognition of vocal emotion during childhood and adolescence.
Chronaki, Georgia; Wigelsworth, Michael; Pell, Marc D; Kotz, Sonja A
2018-06-14
Humans have an innate set of emotions recognised universally. However, emotion recognition also depends on socio-cultural rules. Although adults recognise vocal emotions universally, they identify emotions more accurately in their native language. We examined developmental trajectories of universal vocal emotion recognition in children. Eighty native English speakers completed a vocal emotion recognition task in their native language (English) and in foreign languages (Spanish, Chinese, and Arabic) expressing anger, happiness, sadness, fear, and neutrality. Emotion recognition was compared across 8-to-10-year-olds, 11-to-13-year-olds, and adults. Measures of behavioural and emotional problems were also taken. Results showed that although emotion recognition was above chance for all languages, native English-speaking children were more accurate in recognising vocal emotions in their native language. There was a larger improvement in recognising vocal emotion from the native language during adolescence. Vocal anger recognition did not improve with age for the non-native languages. This is the first study to demonstrate universality of vocal emotion recognition in children whilst supporting an "in-group advantage" for more accurate recognition in the native language. Findings highlight the role of experience in emotion recognition, have implications for child development in modern multicultural societies, and address important theoretical questions about the nature of emotions.
Effects of Emotion on Associative Recognition: Valence and Retention Interval Matter
Pierce, Benton H.; Kensinger, Elizabeth A.
2011-01-01
In two experiments, we examined the effects of emotional valence and arousal on associative binding. Participants studied negative, positive, and neutral word pairs, followed by an associative recognition test. In Experiment 1, with a short-delayed test, accuracy for intact pairs was equivalent across valences, whereas accuracy for rearranged pairs was lower for negative than for positive and neutral pairs. In Experiment 2, we tested participants after a one-week delay and found that accuracy was greater for intact negative than for intact neutral pairs, whereas rearranged pair accuracy was equivalent across valences. These results suggest that, although negative emotional valence impairs associative binding after a short delay, it may improve binding after a longer delay. The results also suggest that valence, as well as arousal, needs to be considered when examining the effects of emotion on associative memory. PMID:21401233
Isaacowitz, Derek M.; Stanley, Jennifer Tehan
2011-01-01
Older adults perform worse on traditional tests of emotion recognition accuracy than do young adults. In this paper, we review descriptive research to date on age differences in emotion recognition from facial expressions, as well as the primary theoretical frameworks that have been offered to explain these patterns. We propose that this is an area of inquiry that would benefit from an ecological approach in which contextual elements are more explicitly considered and reflected in experimental methods. Use of dynamic displays and examination of specific cues to accuracy, for example, may reveal more nuanced age-related patterns and may suggest heretofore unexplored underlying mechanisms. PMID:22125354
The role of attention at retrieval on the false recognition of negative emotional DRM lists.
Shah, Datin; Knott, Lauren M
2018-02-01
This study examined the role of attention at retrieval on the false recognition of emotional items using the Deese-Roediger-McDermott (DRM) paradigm. Previous research has shown that divided attention at test increases false remember judgements for neutral critical lures. However, no research has yet directly assessed emotional false memories when attention is manipulated at retrieval. To examine this, participants studied negative (low in valence and high in arousal) and neutral DRM lists and completed recognition tests under conditions of full and divided attention. Results revealed that divided attention at retrieval increased false remember judgements for all critical lures compared to retrieval under full attention, but in both retrieval conditions, false memories were greater for negative compared to neutral stimuli. We believe that this is due to reliance on a more easily accessible (meaning of the word) but less diagnostic form of source monitoring, amplified under conditions of divided attention.
Effects of hydrocortisone on false memory recognition in healthy men and women.
Duesenberg, Moritz; Weber, Juliane; Schaeuffele, Carmen; Fleischer, Juliane; Hellmann-Regen, Julian; Roepke, Stefan; Moritz, Steffen; Otte, Christian; Wingenfeld, Katja
2016-12-01
Most studies on the effect of stress on false memories, using psychosocial and physiological stressors, have yielded diverse results. In the present study, we systematically tested the effect of exogenous hydrocortisone using a false memory paradigm. In this placebo-controlled study, 37 healthy men and 38 healthy women (mean age 24.59 years) received either 10 mg of hydrocortisone or placebo 75 min before completing the Deese-Roediger-McDermott (DRM) false memory paradigm. We used emotionally charged and neutral DRM-based word lists to compare false recognition rates with true recognition rates. Overall, we expected an increase in false memory after hydrocortisone compared to placebo. No differences between the cortisol and the placebo group were revealed for false or for true recognition performance. In general, false recognition rates were lower than true recognition rates. Furthermore, we found a valence effect (neutral, positive, negative, disgust word stimuli), indicating higher rates of true and false recognition for emotional compared to neutral words. We further found an interaction effect between sex and recognition. Post hoc t tests showed that for true recognition, women showed significantly better memory performance than men, independent of treatment. This study does not support the hypothesis that cortisol decreases the ability to distinguish between old and novel words in young healthy individuals. However, sex and emotional valence of word stimuli appear to be important moderators.
Textual emotion recognition for enhancing enterprise computing
NASA Astrophysics Data System (ADS)
Quan, Changqin; Ren, Fuji
2016-05-01
The growing interest in affective computing (AC) brings many valuable research topics that can meet different application demands in enterprise systems. The present study explores a sub-area of AC techniques: textual emotion recognition for enhancing enterprise computing. Multi-label emotion recognition in text can provide a more comprehensive understanding of emotions than single-label emotion recognition. A representation of the 'emotion state in text' is proposed to encompass the multidimensional emotions in text. It formally describes configurations of basic emotions as well as the relations between them. Our method allows recognition of emotions for words bearing indirect emotion, emotion ambiguity, and multiple emotions. We further investigate the effect of word order on emotional expression by comparing the performance of a bag-of-words model and a sequence model for multi-label sentence emotion recognition. The experiments show that classification results under the sequence model are better than under the bag-of-words model, and a homogeneous Markov model showed promising results for multi-label sentence emotion recognition. This emotion recognition system provides a convenient way to acquire valuable emotion information and to improve enterprise competitive ability in many respects.
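Why a sequence-sensitive model can outperform bag-of-words for sentence emotion is easy to show with a toy: word order (negation, here) changes the emotion, and an order-insensitive sum cannot see it. The tiny lexicon, the negation rule, and the emotion labels below are invented for this demonstration and are not the paper's model.

```python
# Invented two-word emotion lexicon and a single first-order context rule.
LEXICON = {"happy": {"joy": 1.0}, "sad": {"sadness": 1.0}}
NEGATORS = {"not", "never"}
OPPOSITE = {"joy": "sadness", "sadness": "joy"}

def bag_of_words(tokens):
    # Order-insensitive: just sum lexicon scores over the token multiset.
    scores = {}
    for t in tokens:
        for emo, w in LEXICON.get(t, {}).items():
            scores[emo] = scores.get(emo, 0.0) + w
    return scores

def sequence_model(tokens):
    # Order-sensitive: a negator flips the next emotion word's label.
    scores, negate = {}, False
    for t in tokens:
        if t in NEGATORS:
            negate = True
            continue
        for emo, w in LEXICON.get(t, {}).items():
            if negate:
                emo = OPPOSITE.get(emo, emo)
            scores[emo] = scores.get(emo, 0.0) + w
        negate = False
    return scores

print(bag_of_words("i am not happy".split()))    # {'joy': 1.0}
print(sequence_model("i am not happy".split()))  # {'sadness': 1.0}
```

A homogeneous Markov model generalises this single hand-written rule: instead of one negation flip, transition probabilities between word states are learned from labelled text, so arbitrary order effects can be captured.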
van Bokhorst, Lindsey G.; Knapová, Lenka; Majoranc, Kim; Szebeni, Zea K.; Táborský, Adam; Tomić, Dragana; Cañadas, Elena
2016-01-01
In many sports, such as figure skating or gymnastics, the outcome of a performance does not rely exclusively on objective measurements, but on more subjective cues. Judges need high attentional capacities to process visual information and overcome fatigue. Their emotion recognition abilities might also play a role in detecting errors and making more accurate assessments. Moreover, the scoring given by judges could also be influenced by their level of expertise. This study aims to assess how rhythmic gymnastics judges’ emotion recognition and attentional abilities influence accuracy of performance assessment. Data will be collected from rhythmic gymnastics judges and coaches at different international levels. This study will employ an online questionnaire consisting of an emotion recognition test and an attentional test. Participants’ task is to watch a set of videotaped rhythmic gymnastics performances and evaluate them on the artistic and execution components of performance. Their scoring will be compared with the official scores given at the competition the video was taken from to measure the accuracy of the participants’ evaluations. The proposed research represents an interdisciplinary approach that integrates cognitive and sport psychology within experimental and applied contexts. The current study advances the theoretical understanding of how emotional and attentional aspects affect the evaluation of sport performance. The results will provide valuable evidence on the direction and strength of the relationship between the above-mentioned factors and the accuracy of sport performance evaluation. Importantly, practical implications might be drawn from this study. Intervention programs directed at improving the accuracy of judges could be created based on the understanding of how emotion recognition and attentional abilities are related to the accuracy of performance assessment. PMID:27458406
Wang, Bo
2013-01-01
Studies have shown that emotion elicited after learning enhances memory consolidation. However, no prior studies have used facial photos as stimuli. This study examined the effect of post-learning positive emotion on consolidation of memory for faces. During learning, participants viewed neutral, positive, or negative faces. They were then assigned to a condition in which they watched either a 9-minute positive video clip or a 9-minute neutral video. Thirty minutes after learning, participants took a surprise memory test, in which they made "remember", "know", and "new" judgements. The findings are: (1) positive emotion enhanced consolidation of recognition for negative male faces, but impaired consolidation of recognition for negative female faces; (2) for male faces, recognition of negative faces was equivalent to that of positive faces; for female faces, recognition of negative faces was better than that of positive faces. Our study provides important evidence that the effect of post-learning emotion on memory consolidation can extend to facial stimuli and that such an effect can be modulated by facial valence and facial gender. The findings may shed light on establishing models concerning the influence of emotion on memory consolidation.
Arousal Rather than Basic Emotions Influence Long-Term Recognition Memory in Humans
Marchewka, Artur; Wypych, Marek; Moslehi, Abnoos; Riegel, Monika; Michałowski, Jarosław M.; Jednoróg, Katarzyna
2016-01-01
Emotion can influence various cognitive processes, however its impact on memory has been traditionally studied over relatively short retention periods and in line with dimensional models of affect. The present study aimed to investigate emotional effects on long-term recognition memory according to a combined framework of affective dimensions and basic emotions. Images selected from the Nencki Affective Picture System were rated on the scale of affective dimensions and basic emotions. After 6 months, subjects took part in a surprise recognition test during an fMRI session. The more negative the pictures the better they were remembered, but also the more false recognitions they provoked. Similar effects were found for the arousal dimension. Recognition success was greater for pictures with lower intensity of happiness and with higher intensity of surprise, sadness, fear, and disgust. Consecutive fMRI analyses showed a significant activation for remembered (recognized) vs. forgotten (not recognized) images in anterior cingulate and bilateral anterior insula as well as in bilateral caudate nuclei and right thalamus. Further, arousal was found to be the only subjective rating significantly modulating brain activation. Higher subjective arousal evoked higher activation associated with memory recognition in the right caudate and the left cingulate gyrus. Notably, no significant modulation was observed for other subjective ratings, including basic emotion intensities. These results emphasize the crucial role of arousal for long-term recognition memory and support the hypothesis that the memorized material, over time, becomes stored in a distributed cortical network including the core salience network and basal ganglia. PMID:27818626
Emotionally enhanced memory for negatively arousing words: storage or retrieval advantage?
Nadarevic, Lena
2017-12-01
People typically remember emotionally negative words better than neutral words. Two experiments are reported that investigate whether emotionally enhanced memory (EEM) for negatively arousing words is based on a storage or retrieval advantage. Participants studied non-word-word pairs that either involved negatively arousing or neutral target words. Memory for these target words was tested by means of a recognition test and a cued-recall test. Data were analysed with a multinomial model that allows the disentanglement of storage and retrieval processes in the present recognition-then-cued-recall paradigm. In both experiments the multinomial analyses revealed no storage differences between negatively arousing and neutral words but a clear retrieval advantage for negatively arousing words in the cued-recall test. These findings suggest that EEM for negatively arousing words is driven by associative processes.
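The storage-versus-retrieval decomposition that the multinomial model performs can be illustrated with a crude two-parameter sketch for a recognition-then-cued-recall design: storage probability s drives recognition, and retrieval probability r (given storage) drives cued recall. This ignores guessing and is far simpler than the multinomial model actually used in the study; the counts below are hypothetical and only illustrate the decomposition idea.

```python
def estimate_storage_retrieval(n_items, n_recognized, n_recalled):
    """Crude point estimates: storage from recognition, retrieval given storage."""
    s = n_recognized / n_items       # storage estimate (recognition rate)
    r = n_recalled / n_recognized    # retrieval estimate, conditional on storage
    return s, r

# Hypothetical counts mirroring the reported pattern: negatively arousing words
# recognized as often as neutral ones (equal storage) but recalled more often
# (retrieval advantage).
s_neg, r_neg = estimate_storage_retrieval(n_items=40, n_recognized=32, n_recalled=24)
s_neu, r_neu = estimate_storage_retrieval(n_items=40, n_recognized=32, n_recalled=16)
print(s_neg == s_neu, r_neg > r_neu)  # True True
```

The real multinomial model fits these parameters jointly by maximum likelihood over all response categories, with guessing parameters, rather than reading them off raw proportions as here.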
Familial covariation of facial emotion recognition and IQ in schizophrenia.
Andric, Sanja; Maric, Nadja P; Mihaljevic, Marina; Mirjanic, Tijana; van Os, Jim
2016-12-30
Alterations in general intellectual ability and social cognition are core features of schizophrenia, evident at the illness' onset and persistent throughout its course. However, previous studies examining cognitive alterations in siblings discordant for schizophrenia yielded inconsistent results. The present study aimed to investigate the nature of the association between facial emotion recognition and general IQ by applying a genetically sensitive cross-trait cross-sibling design. Participants (total n=158; patients, unaffected siblings, controls) were assessed using the Benton Facial Recognition Test, the Degraded Facial Affect Recognition Task (DFAR) and the Wechsler Adult Intelligence Scale-III. Patients had lower IQ and altered facial emotion recognition in comparison to the other groups. Healthy siblings and controls did not significantly differ in IQ or DFAR performance, but siblings exhibited intermediate angry facial expression recognition. Cross-trait within-subject analyses showed significant associations between overall DFAR performance and IQ in all participants. Within-trait cross-sibling analyses found significant associations between patients' and siblings' IQ and overall DFAR performance, suggesting familial clustering. Finally, cross-trait cross-sibling analyses revealed familial covariation of facial emotion recognition and IQ in siblings discordant for schizophrenia, further indicating their familial etiology. Both traits are important phenotypes for genetic studies and potential early clinical markers of schizophrenia-spectrum disorders. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Collaboration enhances later individual memory for emotional material.
Bärthel, Gwennis A; Wessel, Ineke; Huntjens, Rafaële J C; Verwoerd, Johan
2017-05-01
Research on collaborative remembering suggests that collaboration hampers group memory (i.e., collaborative inhibition), yet enhances later individual memory. Studies examining collaborative effects on memory for emotional stimuli are scarce, especially concerning later individual memory. In the present study, female undergraduates watched an emotional movie and recalled it either collaboratively (n = 60) or individually (n = 60), followed by an individual free recall test and a recognition test. We replicated the standard collaborative inhibition effect. Further, in line with the literature, the collaborative condition displayed better post-collaborative individual memory. Notably, in post-collaborative free recall, the centrality of the information to the movie plot did not play an important role. Recognition yielded slightly different results: although collaboration produced more correct recognition of central details, it did not enhance recognition of background details. Second, the collaborative and individual conditions did not differ with respect to the overlap of unique correct items in free recall, yet during recognition former collaborators more unanimously endorsed correct answers, as well as errors. Finally, extraversion, neuroticism, social anxiety, and depressive symptoms did not moderate the influence of collaboration on memory. Implications for the fields of forensic and clinical psychology are discussed.
Rieffe, Carolien; Wiefferink, Carin H
2017-03-01
The capacity for emotion recognition and understanding is crucial for daily social functioning. We examined to what extent this capacity is impaired in young children with a Language Impairment (LI). In typical development, children learn to recognize emotions in faces and situations through social experiences and social learning. Children with LI have less access to these experiences and are therefore expected to fall behind their peers without LI. In this study, 89 preschool children with LI and 202 children without LI (mean age 3 years and 10 months in both groups) were tested on three indices for facial emotion recognition (discrimination, identification, and attribution in emotion evoking situations). Parents reported on their children's emotion vocabulary and ability to talk about their own emotions. Preschoolers with and without LI performed similarly on the non-verbal task for emotion discrimination. Children with LI fell behind their peers without LI on both other tasks for emotion recognition that involved labelling the four basic emotions (happy, sad, angry, fear). The outcomes of these two tasks were also related to children's level of emotion language. These outcomes emphasize the importance of 'emotion talk' at the youngest age possible for children with LI. Copyright © 2017 Elsevier Ltd. All rights reserved.
Sassenrath, Claudia; Sassenberg, Kai; Ray, Devin G; Scheiter, Katharina; Jarodzka, Halszka
2014-01-01
Two studies examined an unexplored motivational determinant of facial emotion recognition: observer regulatory focus. It was predicted that a promotion focus would enhance facial emotion recognition relative to a prevention focus, because the attentional strategies associated with a promotion focus enhance performance on well-learned or innate tasks such as facial emotion recognition. In Study 1, a promotion or a prevention focus was experimentally induced, and better facial emotion recognition was observed under a promotion focus than under a prevention focus. In Study 2, individual differences in chronic regulatory focus were assessed and attention allocation was measured using eye tracking during the facial emotion recognition task. Results indicated that the positive relation between a promotion focus and facial emotion recognition is mediated by shorter fixation durations on the face, which reflect a pattern of attention allocation matched to the eager strategy of a promotion focus (i.e., striving to make hits). A prevention focus had an impact on neither perceptual processing nor facial emotion recognition. Taken together, these findings demonstrate important mechanisms and consequences of observer motivational orientation for facial emotion recognition.
Facial expression recognition and emotional regulation in narcolepsy with cataplexy.
Bayard, Sophie; Croisier Langenier, Muriel; Dauvilliers, Yves
2013-04-01
Cataplexy is pathognomonic of narcolepsy with cataplexy, and is defined as a transient loss of muscle tone triggered by strong emotions. Recent research suggests abnormal amygdala function in narcolepsy with cataplexy. Emotion processing and emotional regulation strategies are complex functions involving cortical and limbic structures, such as the amygdala. As the amygdala has been shown to play a role in facial emotion recognition, we tested the hypothesis that patients with narcolepsy with cataplexy would have impaired recognition of facial emotional expressions compared with patients affected by central hypersomnia without cataplexy and healthy controls. We also aimed to determine whether cataplexy modulates emotional regulation strategies. Emotional intensity, arousal and valence ratings on Ekman faces displaying happiness, surprise, fear, anger, disgust, sadness and neutral expressions of 21 drug-free patients with narcolepsy with cataplexy were compared with those of 23 drug-free sex-, age- and intellectual level-matched adult patients with hypersomnia without cataplexy and 21 healthy controls. All participants underwent polysomnography recording and multiple sleep latency tests, and completed depression, anxiety and emotional regulation questionnaires. Performance of patients with narcolepsy with cataplexy did not differ from that of patients with hypersomnia without cataplexy or healthy controls, either on intensity ratings of each emotion on its prototypical label or on mean ratings for valence and arousal. Moreover, patients with narcolepsy with cataplexy did not use different emotional regulation strategies. The level of depressive and anxious symptoms in narcolepsy with cataplexy did not differ from the other groups. Our results demonstrate that patients with narcolepsy with cataplexy accurately perceive and discriminate facial emotions, and regulate their emotions normally. The absence of alteration in perceived affective valence remains of major clinical interest in narcolepsy with cataplexy, and it supports the argument for preserved behaviour and social functioning in narcolepsy with cataplexy. © 2012 European Sleep Research Society.
Functional architecture of visual emotion recognition ability: A latent variable approach.
Lewis, Gary J; Lefevre, Carmen E; Young, Andrew W
2016-05-01
Emotion recognition has been a focus of considerable attention for several decades. However, despite this interest, the underlying structure of individual differences in emotion recognition ability has been largely overlooked and thus is poorly understood. For example, limited knowledge exists concerning whether recognition ability for one emotion (e.g., disgust) generalizes to other emotions (e.g., anger, fear). Furthermore, it is unclear whether emotion recognition ability generalizes across modalities, such that those who are good at recognizing emotions from the face, for example, are also good at identifying emotions from nonfacial cues (such as cues conveyed via the body). The primary goal of the current set of studies was to address these questions through establishing the structure of individual differences in visual emotion recognition ability. In three independent samples (Study 1: n = 640; Study 2: n = 389; Study 3: n = 303), we observed that the ability to recognize visually presented emotions is based on different sources of variation: a supramodal emotion-general factor, supramodal emotion-specific factors, and face- and within-modality emotion-specific factors. In addition, we found evidence that general intelligence and alexithymia were associated with supramodal emotion recognition ability. Autism-like traits, empathic concern, and alexithymia were independently associated with face-specific emotion recognition ability. These results (a) provide a platform for further individual differences research on emotion recognition ability, (b) indicate that differentiating levels within the architecture of emotion recognition ability is of high importance, and (c) show that the capacity to understand expressions of emotion in others is linked to broader affective and cognitive processes. (c) 2016 APA, all rights reserved.
Sharp, Carla; Vanwoerden, Salome; Van Baardewijk, Y; Tackett, J L; Stegge, H
2015-06-01
The aims of the current study were to show that the affective component of psychopathy (callous-unemotional traits) is related to deficits in recognizing emotions over and above other psychopathy dimensions and to show that this relationship is driven by a specific deficit in recognizing complex emotions more so than basic emotions. The authors administered the Child Eyes Test to assess emotion recognition in a community sample of preadolescent children between the ages of 10 and 12 (N = 417; 53.6% boys). The task required children to identify a broad array of emotions from photographic stimuli depicting the eye region of the face. Stimuli were then divided into complex or basic emotions. Results demonstrated a unique association between callous-unemotional traits and complex emotions, with weaker associations with basic emotion recognition, over and above other dimensions of psychopathy.
Emotional memory: No source memory without old-new recognition.
Bell, Raoul; Mieth, Laura; Buchner, Axel
2017-02-01
Findings reported in the memory literature suggest that the emotional components of an encoding episode can be dissociated from nonemotional memory. In particular, it has been found that the previous association with threatening events can be retrieved in aversive conditioning even in the absence of item identification. In the present study, we test whether emotional source memory can be independent of item recognition. Participants saw pictures of snakes paired with threatening and nonthreatening context information (poisonousness or nonpoisonousness). In the source memory test, participants were required to remember whether a snake was associated with poisonousness or nonpoisonousness. A simple extension of a well-established multinomial source monitoring model was used to measure source memory for unrecognized items. By using this model, it was possible to assess directly whether participants were able to associate a previously seen snake with poisonousness or nonpoisonousness even if the snake itself was not recognized as having been presented during the experiment. In 3 experiments, emotional source memory was only found for recognized items. While source memory for recognized items differed between emotional and nonemotional information, source memory for unrecognized items was equally absent for emotional and nonemotional information. We conclude that emotional context information is bound to item representations and cannot be retrieved in the absence of item recognition. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
von Piekartz, H; Wallwork, S B; Mohr, G; Butler, D S; Moseley, G L
2015-04-01
Alexithymia, or a lack of emotional awareness, is prevalent in some chronic pain conditions and has been linked to poor recognition of others' emotions. Recognising others' emotions from their facial expression involves both emotional and motor processing, but the possible contribution of motor disruption has not been considered. It is possible that poor performance on emotional recognition tasks could reflect problems with emotional processing, motor processing or both. We hypothesised that people with chronic facial pain would be less accurate in recognising others' emotions from facial expressions, would be less accurate in a motor imagery task involving the face, and that performance on both tasks would be positively related. A convenience sample of 19 people (15 females) with chronic facial pain and 19 gender-matched controls participated. They undertook two tasks; in the first task, they identified the facial emotion presented in a photograph. In the second, they identified whether the person in the image had a facial feature pointed towards their left or right side, a well-recognised paradigm to induce implicit motor imagery. People with chronic facial pain performed worse than controls at both tasks (Facially Expressed Emotion Labelling (FEEL) task P < 0.001; left/right judgment task P < 0.001). Participants who were more accurate at one task were also more accurate at the other, regardless of group (P < 0.001, r² = 0.523). Participants with chronic facial pain were worse than controls at both the FEEL emotion recognition task and the left/right facial expression task, and performance covaried within participants. We propose that disrupted motor processing may underpin or at least contribute to the difficulty that facial pain patients have in emotion recognition and that further research that tests this proposal is warranted. © 2014 John Wiley & Sons Ltd.
Luo, Xin; Fu, Qian-Jie; Galvin, John J.
2007-01-01
The present study investigated the ability of normal-hearing listeners and cochlear implant users to recognize vocal emotions. Sentences were produced by 1 male and 1 female talker according to 5 target emotions: angry, anxious, happy, sad, and neutral. Overall amplitude differences between the stimuli were either preserved or normalized. In experiment 1, vocal emotion recognition was measured in normal-hearing and cochlear implant listeners; cochlear implant subjects were tested using their clinically assigned processors. When overall amplitude cues were preserved, normal-hearing listeners achieved near-perfect performance, whereas cochlear implant listeners recognized less than half of the target emotions. Removing the overall amplitude cues significantly worsened mean normal-hearing and cochlear implant performance. In experiment 2, vocal emotion recognition was measured in cochlear implant listeners as a function of the number of channels (from 1 to 8) and envelope filter cutoff frequency (50 vs 400 Hz) in experimental speech processors. In experiment 3, vocal emotion recognition was measured in normal-hearing listeners as a function of the number of channels (from 1 to 16) and envelope filter cutoff frequency (50 vs 500 Hz) in acoustic cochlear implant simulations. Results from experiments 2 and 3 showed that both cochlear implant and normal-hearing performance significantly improved as the number of channels or the envelope filter cutoff frequency was increased. The results suggest that spectral, temporal, and overall amplitude cues each contribute to vocal emotion recognition. The poorer cochlear implant performance is most likely attributable to the lack of salient pitch cues and the limited functional spectral resolution. PMID:18003871
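The amplitude-normalization manipulation in experiment 1 (removing overall amplitude cues so that loudness cannot signal emotion) can be sketched as a simple RMS equalization of each stimulus. This is a generic illustration under assumed conventions (float samples, an arbitrary target level), not the authors' actual stimulus-processing code:

```python
import math

def rms(samples):
    """Root-mean-square level of a signal given as a list of float samples."""
    return math.sqrt(sum(x * x for x in samples) / len(samples))

def normalize_rms(samples, target_rms=0.1):
    """Rescale a signal so every stimulus has the same overall level,
    removing overall amplitude as a cue to vocal emotion."""
    level = rms(samples)
    if level == 0.0:
        return list(samples)  # silent signal: nothing to scale
    gain = target_rms / level
    return [x * gain for x in samples]
```

Applying the same target RMS to every sentence equates the stimuli in overall level while leaving their spectral and temporal envelope cues intact.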
Cerami, Chiara; Dodich, Alessandra; Iannaccone, Sandro; Marcone, Alessandra; Lettieri, Giada; Crespi, Chiara; Gianolli, Luigi; Cappa, Stefano F.; Perani, Daniela
2015-01-01
The behavioural variant of frontotemporal dementia (bvFTD) is a rare disease mainly affecting the social brain. FDG-PET fronto-temporal hypometabolism is a supportive feature for the diagnosis. It may also provide specific functional metabolic signatures for altered socio-emotional processing. In this study, we evaluated the emotion recognition and attribution deficits and FDG-PET cerebral metabolic patterns at the group and individual levels in a sample of sporadic bvFTD patients, exploring the cognitive-functional correlations. Seventeen probable mild bvFTD patients (10 male and 7 female; age 67.8±9.9) were administered standardized and validated version of social cognition tasks assessing the recognition of basic emotions and the attribution of emotions and intentions (i.e., Ekman 60-Faces test-Ek60F and Story-based Empathy task-SET). FDG-PET was analysed using an optimized voxel-based SPM method at the single-subject and group levels. Severe deficits of emotion recognition and processing characterized the bvFTD condition. At the group level, metabolic dysfunction in the right amygdala, temporal pole, and middle cingulate cortex was highly correlated to the emotional recognition and attribution performances. At the single-subject level, however, heterogeneous impairments of social cognition tasks emerged, and different metabolic patterns, involving limbic structures and prefrontal cortices, were also observed. The derangement of a right limbic network is associated with altered socio-emotional processing in bvFTD patients, but different hypometabolic FDG-PET patterns and heterogeneous performances on social tasks at an individual level exist. PMID:26513651
Deficits in Facial Emotion Recognition in Schizophrenia: A Replication Study with Korean Subjects
Lee, Seung Jae; Lee, Hae-Kook; Kweon, Yong-Sil; Lee, Chung Tai
2010-01-01
Objective: We investigated the deficit in the recognition of facial emotions in a sample of medicated, stable Korean patients with schizophrenia using Korean facial emotion pictures, and examined whether the possible impairments would corroborate previous findings. Methods: Fifty-five patients with schizophrenia and 62 healthy control subjects completed the Facial Affect Identification Test with a new set of 44 colored photographs of Korean faces including the six universal emotions as well as neutral faces. Results: Korean patients with schizophrenia showed impairments in the recognition of sad, fearful, and angry faces [F(1,114)=6.26, p=0.014; F(1,114)=6.18, p=0.014; F(1,114)=9.28, p=0.003, respectively], but their accuracy was no different from that of controls in the recognition of happy emotions. Higher total and three subscale scores of the Positive and Negative Syndrome Scale (PANSS) correlated with worse performance on both angry and neutral faces. Correct responses to happy stimuli were negatively correlated with negative symptom scores on the PANSS. Patients with schizophrenia also exhibited different patterns of misidentification relative to normal controls. Conclusion: These findings were consistent with previous studies carried out with different ethnic groups, suggesting cross-cultural similarities in facial recognition impairment in schizophrenia. PMID:21253414
Multimedia Content Development as a Facial Expression Datasets for Recognition of Human Emotions
NASA Astrophysics Data System (ADS)
Mamonto, N. E.; Maulana, H.; Liliana, D. Y.; Basaruddin, T.
2018-02-01
Previously developed datasets contain facial expressions of foreign subjects. The development of this multimedia content aims to address the problems experienced by the research team and by other researchers conducting similar research. The method used to develop the multimedia content as a facial expression dataset for human emotion recognition is the Villamil-Molina version of the multimedia development method. The content was developed with 10 subjects (talents), each performing 3 shots and demonstrating 19 facial expressions in each shot. After editing and rendering, tests were carried out, with the conclusion that the multimedia content can be used as a facial expression dataset for the recognition of human emotions.
Martinelli, Eugenio; Mencattini, Arianna; Di Natale, Corrado
2016-01-01
Humans can communicate their emotions by modulating facial expressions or the tone of their voice. Although numerous applications exist that enable machines to read facial emotions and recognize the content of verbal messages, methods for speech emotion recognition are still in their infancy. Yet, fast and reliable applications for emotion recognition are the obvious advancement of present ‘intelligent personal assistants’, and may have countless applications in diagnostics, rehabilitation and research. Taking inspiration from the dynamics of human group decision-making, we devised a novel speech emotion recognition system that applies, for the first time, a semi-supervised prediction model based on consensus. Three tests were carried out to compare this algorithm with traditional approaches. Labeling performance on a public database of spontaneous speech is reported. The novel system appears to be fast, robust and less computationally demanding than traditional methods, allowing for easier implementation in portable voice-analyzers (as used in rehabilitation, research, industry, etc.) and for applications in the research domain (such as real-time pairing of stimuli to participants’ emotional state, selective/differential data collection based on emotional content, etc.). PMID:27563724
Kanakam, Natalie; Krug, Isabel; Raoult, Charlotte; Collier, David; Treasure, Janet
2013-07-01
Emotional processing difficulties are potential risk markers for eating disorders that are also present after recovery. The aim of this study was to examine these traits in twins with eating disorders. The Reading the Mind in the Eyes test, Emotional Stroop task and the Difficulties in Emotion Regulation Scale were administered to 112 twins with and without eating disorders (DSM IV-TR eating disorder criteria). Generalised estimating equations compared twins with eating disorders against unaffected co-twins and control twins, and within-pair correlations were calculated for clinical monozygotic (n = 50) and dizygotic twins (n = 20). Emotion recognition difficulties, attentional biases to social threat and difficulties in emotion regulation were greater in twins with eating disorders, and some were present in their unaffected twin siblings. Evidence for a possible genetic basis was highest for emotion recognition and attentional biases to social stimuli. Emotion recognition difficulties and sensitivity to social threat appear to be endophenotypes associated with eating disorders. However, the limited statistical power means that these findings are tentative and require further replication. Copyright © 2013 John Wiley & Sons, Ltd and Eating Disorders Association.
Repetition and brain potentials when recognizing natural scenes: task and emotion differences
Bradley, Margaret M.; Codispoti, Maurizio; Karlsson, Marie; Lang, Peter J.
2013-01-01
Repetition has long been known to facilitate memory performance, but its effects on event-related potentials (ERPs), measured as an index of recognition memory, are less well characterized. In Experiment 1, effects of both massed and distributed repetition on old–new ERPs were assessed during an immediate recognition test that followed incidental encoding of natural scenes that also varied in emotionality. Distributed repetition at encoding enhanced both memory performance and the amplitude of an old–new ERP difference over centro-parietal sensors. To assess whether these repetition effects reflect encoding or retrieval differences, the recognition task was replaced with passive viewing of old and new pictures in Experiment 2. In the absence of an explicit recognition task, ERPs were completely unaffected by repetition at encoding, and only emotional pictures prompted a modestly enhanced old–new difference. Taken together, the data suggest that repetition facilitates retrieval processes and that, in the absence of an explicit recognition task, differences in old–new ERPs are only apparent for affective cues. PMID:22842817
Recognition memory for emotional and neutral faces: an event-related potential study.
Johansson, Mikael; Mecklinger, Axel; Treese, Anne-Cécile
2004-12-01
This study examined emotional influences on the hypothesized event-related potential (ERP) correlates of familiarity and recollection (Experiment 1) and the states of awareness (Experiment 2) accompanying recognition memory for faces differing in facial affect. Participants made gender judgments to positive, negative, and neutral faces at study and were in the test phase instructed to discriminate between studied and nonstudied faces. Whereas old-new discrimination was unaffected by facial expression, negative faces were recollected to a greater extent than both positive and neutral faces as reflected in the parietal ERP old-new effect and in the proportion of remember judgments. Moreover, emotion-specific modulations were observed in frontally recorded ERPs elicited by correctly rejected new faces that concurred with a more liberal response criterion for emotional as compared to neutral faces. Taken together, the results are consistent with the view that processes promoting recollection are facilitated for negative events and that emotion may affect recognition performance by influencing criterion setting mediated by the prefrontal cortex.
Social Learning Modulates the Lateralization of Emotional Valence
ERIC Educational Resources Information Center
Shamay-Tsoory, Simone G.; Lavidor, Michal; Aharon-Peretz, Judith
2008-01-01
Although neuropsychological studies of lateralization of emotion have emphasized valence (positive vs. negative) or type (basic vs. complex) dimensions, the interaction between the two dimensions has yet to be elucidated. The purpose of the current study was to test the hypothesis that recognition of basic emotions is processed preferentially by…
Barbato, Mariapaola; Liu, Lu; Cadenhead, Kristin S; Cannon, Tyrone D; Cornblatt, Barbara A; McGlashan, Thomas H; Perkins, Diana O; Seidman, Larry J; Tsuang, Ming T; Walker, Elaine F; Woods, Scott W; Bearden, Carrie E; Mathalon, Daniel H; Heinssen, Robert; Addington, Jean
2015-09-01
Social cognition, the mental operations that underlie social interactions, is a major construct to investigate in schizophrenia. Impairments in social cognition are present before the onset of psychosis, and even in unaffected first-degree relatives, suggesting that social cognition may be a trait marker of the illness. In a large cohort of individuals at clinical high risk for psychosis (CHR) and healthy controls, three domains of social cognition (theory of mind, facial emotion recognition and social perception) were assessed to clarify which domains are impaired in this population. Six hundred and seventy-five CHR individuals and 264 controls, who were part of the multi-site North American Prodromal Longitudinal Study, completed The Awareness of Social Inference Test, the Penn Emotion Recognition task, the Penn Emotion Differentiation task, and the Relationship Across Domains, measures of theory of mind, facial emotion recognition, and social perception, respectively. Social cognition was not related to positive and negative symptom severity, but was associated with age and IQ. CHR individuals demonstrated poorer performance on all measures of social cognition. However, after controlling for age and IQ, the group differences remained significant for measures of theory of mind and social perception, but not for facial emotion recognition. Theory of mind and social perception are impaired in individuals at CHR for psychosis. Age and IQ seem to play an important role in the emergence of deficits in facial affect recognition. Future studies should examine the stability of social cognition deficits over time and their role, if any, in the development of psychosis.
Can emotion recognition be taught to children with autism spectrum conditions?
Baron-Cohen, Simon; Golan, Ofer; Ashwin, Emma
2009-01-01
Children with autism spectrum conditions (ASC) have major difficulties in recognizing and responding to emotional and mental states in others' facial expressions. Such difficulties in empathy underlie their social-communication difficulties that form a core of the diagnosis. In this paper we ask whether aspects of empathy can be taught to young children with ASC. We review a study that evaluated The Transporters, an animated series designed to enhance emotion comprehension in children with ASC. Children with ASC (4–7 years old) watched The Transporters every day for four weeks. Participants were tested before and after intervention on emotional vocabulary and emotion recognition at three levels of generalization. The intervention group improved significantly more than a clinical control group on all task levels, performing comparably to typical controls at time 2. The discussion centres on how vehicles as mechanical systems may be one key reason why The Transporters caused the improved understanding and recognition of emotions in children with ASC. The implications for the design of autism-friendly interventions are also explored. PMID:19884151
Static facial expression recognition with convolution neural networks
NASA Astrophysics Data System (ADS)
Zhang, Feng; Chen, Zhong; Ouyang, Chao; Zhang, Yifei
2018-03-01
Facial expression recognition is a currently active research topic in the fields of computer vision, pattern recognition and artificial intelligence. In this paper, we have developed a convolutional neural network (CNN) for classifying human emotions from static facial expressions into one of seven facial emotion categories. We pre-train our CNN model on the combined FER2013 dataset formed by the train, validation and test sets, and fine-tune on the extended Cohn-Kanade database. In order to reduce overfitting of the models, we utilized different techniques including dropout and batch normalization in addition to data augmentation. According to the experimental results, our CNN model has excellent classification performance and robustness for facial expression recognition.
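The classification pipeline described above (convolution, nonlinearity, pooling, then a softmax over seven emotion categories) can be illustrated with a minimal forward pass in pure Python. This is a didactic sketch with toy weights, not the authors' trained FER2013/Cohn-Kanade model; the category names follow the common FER2013 convention:

```python
import math

# The seven categories used by FER2013-style classifiers (assumed labels).
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def conv2d(image, kernel):
    """'Valid' 2-D convolution of a grayscale image with one kernel."""
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])
    return [
        [
            sum(image[i + a][j + b] * kernel[a][b]
                for a in range(kh) for b in range(kw))
            for j in range(w - kw + 1)
        ]
        for i in range(h - kh + 1)
    ]

def relu(feature_map):
    return [[max(0.0, v) for v in row] for row in feature_map]

def global_average_pool(feature_map):
    values = [v for row in feature_map for v in row]
    return sum(values) / len(values)

def softmax(logits):
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(image, kernels, weights, biases):
    """One conv layer -> ReLU -> global average pooling -> linear -> softmax."""
    features = [global_average_pool(relu(conv2d(image, k))) for k in kernels]
    logits = [sum(w * f for w, f in zip(ws, features)) + b
              for ws, b in zip(weights, biases)]
    return softmax(logits)
```

With any weights, the output is a 7-way probability distribution over EMOTIONS; what the paper adds on top of this skeleton is training, dropout, batch normalization and data augmentation.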
Social cognition in schizophrenia and healthy aging: differences and similarities.
Silver, Henry; Bilker, Warren B
2014-12-01
Social cognition is impaired in schizophrenia, but it is not clear whether this is specific to the illness and whether emotion perception is selectively affected. To study this, we examined the perception of emotional and non-emotional cues in facial expressions, a key social cognitive skill, in schizophrenia patients and older healthy individuals, using young healthy individuals as a reference. Tests of object recognition, visual orientation, psychomotor speed, and working memory were included to allow multivariate analysis taking into account other cognitive functions. Schizophrenia patients showed impairments in the recognition of identity and emotional facial cues compared to the young and old healthy groups. Severity was similar to that for object recognition and visuospatial processing. The older and younger healthy groups did not differ from each other on these tests. Schizophrenia patients and old healthy individuals were similarly impaired in the ability to automatically learn new faces during the testing procedure (measured by the CSTFAC index) compared to young healthy individuals. Social cognition is distinctly impaired in schizophrenia compared to healthy aging. Further study is needed to identify the mechanisms of automatic social cognitive learning impairment in schizophrenia patients and healthy aging individuals, and to determine whether similar neural systems are affected. Copyright © 2014 Elsevier B.V. All rights reserved.
Sully, K; Sonuga-Barke, E J S; Fairchild, G
2015-07-01
There is accumulating evidence of impairments in facial emotion recognition in adolescents with conduct disorder (CD). However, the majority of studies in this area have only been able to demonstrate an association, rather than a causal link, between emotion recognition deficits and CD. To move closer towards understanding the causal pathways linking emotion recognition problems with CD, we studied emotion recognition in the unaffected first-degree relatives of CD probands, as well as those with a diagnosis of CD. Using a family-based design, we investigated facial emotion recognition in probands with CD (n = 43), their unaffected relatives (n = 21), and healthy controls (n = 38). We used the Emotion Hexagon task, an alternative forced-choice task using morphed facial expressions depicting the six primary emotions, to assess facial emotion recognition accuracy. Relative to controls, the CD group showed impaired recognition of anger, fear, happiness, sadness and surprise (all p < 0.005). Similar to probands with CD, unaffected relatives showed deficits in anger and happiness recognition relative to controls (all p < 0.008), with a trend toward a deficit in fear recognition. There were no significant differences in performance between the CD probands and the unaffected relatives following correction for multiple comparisons. These results suggest that facial emotion recognition deficits are present in adolescents who are at increased familial risk for developing antisocial behaviour, as well as those who have already developed CD. Consequently, impaired emotion recognition appears to be a viable familial risk marker or candidate endophenotype for CD.
Exploring Cultural Differences in the Recognition of the Self-Conscious Emotions.
Chung, Joanne M; Robins, Richard W
2015-01-01
Recent research suggests that the self-conscious emotions of embarrassment, shame, and pride have distinct, nonverbal expressions that can be recognized in the United States at above-chance levels. However, few studies have examined the recognition of these emotions in other cultures, and little research has been conducted in Asia. Consequently, the cross-cultural generalizability of self-conscious emotions has not been firmly established. Additionally, there is no research that examines cultural variability in the recognition of the self-conscious emotions. Cultural values and exposure to Western culture have been identified as contributors to variability in recognition rates for the basic emotions; we sought to examine this for the self-conscious emotions using the University of California, Davis Set of Emotion Expressions (UCDSEE). The present research examined recognition of the self-conscious emotion expressions in South Korean college students and found that recognition rates were very high for pride, low but above chance for shame, and near zero for embarrassment. To examine what might be underlying the recognition rates we found in South Korea, recognition of self-conscious emotions and several cultural values were examined in a U.S. college student sample of European Americans, Asian Americans, and Asian-born individuals. Emotion recognition rates were generally similar between the European Americans and Asian Americans, and higher than emotion recognition rates for Asian-born individuals. These differences were not explained by cultural values in an interpretable manner, suggesting that exposure to Western culture is a more important mediator than values.
Ventromedial prefrontal cortex mediates visual attention during facial emotion recognition.
Wolf, Richard C; Philippi, Carissa L; Motzkin, Julian C; Baskaya, Mustafa K; Koenigs, Michael
2014-06-01
The ventromedial prefrontal cortex is known to play a crucial role in regulating human social and emotional behaviour, yet the precise mechanisms by which it subserves this broad function remain unclear. Whereas previous neuropsychological studies have largely focused on the role of the ventromedial prefrontal cortex in higher-order deliberative processes related to valuation and decision-making, here we test whether ventromedial prefrontal cortex may also be critical for more basic aspects of orienting attention to socially and emotionally meaningful stimuli. Using eye tracking during a test of facial emotion recognition in a sample of lesion patients, we show that bilateral ventromedial prefrontal cortex damage impairs visual attention to the eye regions of faces, particularly for fearful faces. This finding demonstrates a heretofore unrecognized function of the ventromedial prefrontal cortex-the basic attentional process of controlling eye movements to faces expressing emotion. © The Author (2014). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Ferrucci, Roberta; Giannicola, Gaia; Rosa, Manuela; Fumagalli, Manuela; Boggio, Paulo Sergio; Hallett, Mark; Zago, Stefano; Priori, Alberto
2012-01-01
Some evidence suggests that the cerebellum participates in the complex network processing emotional facial expression. To evaluate the role of the cerebellum in recognising facial expressions we delivered transcranial direct current stimulation (tDCS) over the cerebellum and prefrontal cortex. A facial emotion recognition task was administered to 21 healthy subjects before and after cerebellar tDCS; we also tested subjects with a visual attention task and a visual analogue scale (VAS) for mood. Anodal and cathodal cerebellar tDCS both significantly enhanced sensory processing in response to negative facial expressions (anodal tDCS, p=.0021; cathodal tDCS, p=.018), but left positive emotion and neutral facial expressions unchanged (p>.05). tDCS over the right prefrontal cortex left facial expressions of both negative and positive emotion unchanged. These findings suggest that the cerebellum is specifically involved in processing facial expressions of negative emotion.
Recognition of facial emotions in neuropsychiatric disorders.
Kohler, Christian G; Turner, Travis H; Gur, Raquel E; Gur, Ruben C
2004-04-01
Recognition of facial emotions represents an important aspect of interpersonal communication and is governed by select neural substrates. We present data on emotion recognition in healthy young adults utilizing a novel set of color photographs of evoked universal emotions. In addition, we review the recent literature on emotion recognition in psychiatric and neurologic disorders, and studies that compare different disorders.
Emotion Recognition From Singing Voices Using Contemporary Commercial Music and Classical Styles.
Hakanpää, Tua; Waaramaa, Teija; Laukkanen, Anne-Maria
2018-02-22
This study examines the recognition of emotion in contemporary commercial music (CCM) and classical styles of singing. This information may be useful in improving the training of interpretation in singing. This is an experimental comparative study. Thirteen singers (11 female, 2 male) with a minimum of 3 years' professional-level singing studies (in CCM or classical technique or both) participated. They sang at three pitches (females: a, e1, a1, males: one octave lower) expressing anger, sadness, joy, tenderness, and a neutral state. Twenty-nine listeners listened to 312 short (0.63- to 4.8-second) voice samples, 135 of which were sung using a classical singing technique and 165 of which were sung in a CCM style. The listeners were asked which emotion they heard. Activity and valence were derived from the chosen emotions. The percentage of correct recognitions out of all the answers in the listening test (N = 9048) was 30.2%. The recognition percentage for the CCM-style singing technique was higher (34.5%) than for the classical-style technique (24.5%). Valence and activation were better perceived than the emotions themselves, and activity was better recognized than valence. A higher pitch was more likely to be perceived as joy or anger, and a lower pitch as sorrow. Both valence and activation were better recognized in the female CCM samples than in the other samples. There are statistically significant differences in the recognition of emotions between classical and CCM styles of singing. Furthermore, in the singing voice, pitch affects the perception of emotions, and valence and activity are more easily recognized than emotions. Copyright © 2018 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
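The recognition percentages reported above (overall and per singing style) come from straightforward counting over listener responses. The toy records and field names below are assumptions for illustration only, not the study's data:

```python
from collections import defaultdict

# Hypothetical toy listener responses (style, intended emotion, chosen emotion);
# illustrative of how overall and per-style recognition percentages are computed.
responses = [
    ("CCM", "joy", "joy"),
    ("CCM", "anger", "anger"),
    ("CCM", "sadness", "tenderness"),
    ("classical", "joy", "neutral"),
    ("classical", "anger", "anger"),
    ("classical", "sadness", "sadness"),
]

def recognition_rates(records):
    """Return (overall %, per-style %) of correctly identified emotions."""
    correct = sum(1 for _, target, chosen in records if target == chosen)
    overall = 100.0 * correct / len(records)
    by_style = defaultdict(lambda: [0, 0])   # style -> [correct, total]
    for style, target, chosen in records:
        by_style[style][1] += 1
        if target == chosen:
            by_style[style][0] += 1
    per_style = {s: 100.0 * c / n for s, (c, n) in by_style.items()}
    return overall, per_style

overall, per_style = recognition_rates(responses)
print(round(overall, 1), per_style)
```

With the study's 9048 answers, the same counting yields the reported 30.2% overall, 34.5% for CCM and 24.5% for classical samples.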
Kensinger, Elizabeth A; Choi, Hae-Yoon; Murray, Brendan D; Rajaram, Suparna
2016-07-01
In daily life, emotional events are often discussed with others. The influence of these social interactions on the veracity of emotional memories has rarely been investigated. The authors (Choi, Kensinger, & Rajaram Memory and Cognition, 41, 403-415, 2013) previously demonstrated that when the categorical relatedness of information is controlled, emotional items are more accurately remembered than neutral items. The present study examined whether emotion would continue to improve the accuracy of memory when individuals discussed the emotional and neutral events with others. Two different paradigms involving social influences were used to investigate this question and compare evidence. In both paradigms, participants studied stimuli that were grouped into conceptual categories of positive (e.g., celebration), negative (e.g., funeral), or neutral (e.g., astronomy) valence. After a 48-hour delay, recognition memory was tested for studied items and categorically related lures. In the first paradigm, recognition accuracy was compared when memory was tested individually or in a collaborative triad. In the second paradigm, recognition accuracy was compared when a prior retrieval session had occurred individually or with a confederate who supplied categorically related lures. In both of these paradigms, emotional stimuli were remembered more accurately than were neutral stimuli, and this pattern was preserved when social interaction occurred. In fact, in the first paradigm, there was a trend for collaboration to increase the beneficial effect of emotion on memory accuracy, and in the second paradigm, emotional lures were significantly less susceptible to the "social contagion" effect. Together, these results demonstrate that emotional memories can be more accurate than nonemotional ones even when events are discussed with others (Experiment 1) and even when that discussion introduces misinformation (Experiment 2).
Facial Emotion Recognition and Expression in Parkinson's Disease: An Emotional Mirror Mechanism?
Ricciardi, Lucia; Visco-Comandini, Federica; Erro, Roberto; Morgante, Francesca; Bologna, Matteo; Fasano, Alfonso; Ricciardi, Diego; Edwards, Mark J; Kilner, James
2017-01-01
Parkinson's disease (PD) patients have impairment of facial expressivity (hypomimia) and difficulties in interpreting the emotional facial expressions produced by others, especially for aversive emotions. We aimed to evaluate the ability to produce facial emotional expressions and to recognize facial emotional expressions produced by others in a group of PD patients and a group of healthy participants in order to explore the relationship between these two abilities and any differences between the two groups of participants. Twenty non-demented, non-depressed PD patients and twenty healthy participants (HC) matched for demographic characteristics were studied. The ability to recognize emotional facial expressions was assessed with the Ekman 60-faces test (Emotion recognition task). Participants were video-recorded while posing facial expressions of 6 primary emotions (happiness, sadness, surprise, disgust, fear and anger). The most expressive pictures for each emotion were derived from the videos. Ten healthy raters were asked to look at the pictures displayed on a computer-screen in pseudo-random fashion and to identify the emotional label in a six-forced-choice response format (Emotion expressivity task). Reaction time (RT) and accuracy of responses were recorded. At the end of each trial the participant was asked to rate his/her confidence in his/her perceived accuracy of response. For emotion recognition, PD patients scored lower than HC on the Ekman total score (p<0.001), and on the single-emotion sub-scores for happiness, fear, anger and sadness (p<0.01), and surprise (p = 0.02). In the facial emotion expressivity task, PD and HC significantly differed in the total score (p = 0.05) and in the sub-scores for happiness, sadness, anger (all p<0.001). RT and the level of confidence showed significant differences between PD and HC for the same emotions.
There was a significant positive correlation between the emotion facial recognition and expressivity in both groups; the correlation was even stronger when ranking emotions from the best recognized to the worst (R = 0.75, p = 0.004). PD patients showed difficulties in recognizing emotional facial expressions produced by others and in posing facial emotional expressions compared to healthy subjects. The linear correlation between recognition and expression in both experimental groups suggests that the two mechanisms share a common system, which could be deteriorated in patients with PD. These results open new clinical and rehabilitation perspectives.
Facial Emotion Recognition: A Survey and Real-World User Experiences in Mixed Reality
Mehta, Dhwani; Siddiqui, Mohammad Faridul Haque; Javaid, Ahmad Y
2018-01-01
Extensive possibilities of applications have made emotion recognition ineluctable and challenging in the field of computer science. The use of non-verbal cues such as gestures, body movement, and facial expressions convey the feeling and the feedback to the user. This discipline of Human–Computer Interaction places reliance on the algorithmic robustness and the sensitivity of the sensor to ameliorate the recognition. Sensors play a significant role in accurate detection by providing a very high-quality input, hence increasing the efficiency and the reliability of the system. Automatic recognition of human emotions would help in teaching social intelligence in the machines. This paper presents a brief study of the various approaches and the techniques of emotion recognition. The survey covers a succinct review of the databases that are considered as data sets for algorithms detecting the emotions by facial expressions. Later, mixed reality device Microsoft HoloLens (MHL) is introduced for observing emotion recognition in Augmented Reality (AR). A brief introduction of its sensors, their application in emotion recognition and some preliminary results of emotion recognition using MHL are presented. The paper then concludes by comparing results of emotion recognition by the MHL and a regular webcam. PMID:29389845
Effects of emotional context on memory for details: the role of attention.
Kim, Johann Sung-Cheul; Vossel, Gerhard; Gamer, Matthias
2013-01-01
It was repeatedly demonstrated that a negative emotional context enhances memory for central details while impairing memory for peripheral information. This trade-off effect is assumed to result from attentional processes: a negative context seems to narrow attention to central information at the expense of more peripheral details, thus causing the differential effects in memory. However, this explanation has rarely been tested and previous findings were partly inconclusive. For the present experiment 13 negative and 13 neutral naturalistic, thematically driven picture stories were constructed to test the trade-off effect in an ecologically more valid setting as compared to previous studies. During an incidental encoding phase, eye movements were recorded as an index of overt attention. In a subsequent recognition phase, memory for central and peripheral details occurring in the picture stories was tested. Explicit affective ratings and autonomic responses validated the induction of emotion during encoding. Consistent with the emotional trade-off effect on memory, encoding context differentially affected recognition of central and peripheral details. However, contrary to the common assumption, the emotional trade-off effect on memory was not mediated by attentional processes. By contrast, results suggest that the relevance of attentional processing for later recognition memory depends on the centrality of information and the emotional context but not their interaction. Thus, central information was remembered well even when fixated very briefly whereas memory for peripheral information depended more on overt attention at encoding. Moreover, the influence of overt attention on memory for central and peripheral details seems to be much lower for an arousing as compared to a neutral context.
Nobiletin improves emotional and novelty recognition memory but not spatial referential memory.
Kang, Jiyun; Shin, Jung-Won; Kim, Yoo-Rim; Swanberg, Kelley M; Kim, Yooseung; Bae, Jae Ryong; Kim, Young Ki; Lee, Jinwon; Kim, Soo-Yeon; Sohn, Nak-Won; Maeng, Sungho
2017-01-01
How to maintain and enhance cognitive functions for both aged and young populations is a highly interesting subject. But candidate memory-enhancing reagents are tested almost exclusively on lesioned or aged animals. Also, there is insufficient information on the type of memory these reagents can improve. Working memory, located in the prefrontal cortex, manages short-term sensory information, but, by gaining significant relevance, this information is converted to long-term memory by hippocampal formation and/or amygdala, followed by tagging with space-time or emotional cues, respectively. Nobiletin is a product of citrus peel known for cognitive-enhancing effects in various pharmacological and neurodegenerative disease models, yet, it is not well studied in non-lesioned animals and the type of memory that nobiletin can improve remains unclear. In this study, 8-week-old male mice were tested using behavioral measurements for working, spatial referential, emotional and visual recognition memory after daily administration of nobiletin. While nobiletin did not induce any change of spontaneous activity in the open field test, freezing by fear conditioning and novel object recognition increased. However, the effectiveness of spatial navigation in the Y-maze and Morris water maze was not improved. These results mean that nobiletin can specifically improve memories of emotionally salient information associated with fear and novelty, but not of spatial information without emotional saliency. Accordingly, the use of nobiletin on normal subjects as a memory enhancer would be more effective on emotional types but may have limited value for the improvement of episodic memories.
Emotion Recognition Ability: A Multimethod-Multitrait Study.
ERIC Educational Resources Information Center
Gaines, Margie; And Others
A common paradigm in measuring the ability to recognize facial expressions of emotion is to present photographs of facial expressions and to ask subjects to identify the emotion. The Affect Blend Test (ABT) uses this method of assessment and is scored for accuracy on specific affects as well as total accuracy. Another method of measuring affect…
Mapping correspondence between facial mimicry and emotion recognition in healthy subjects.
Ponari, Marta; Conson, Massimiliano; D'Amico, Nunzia Pina; Grossi, Dario; Trojano, Luigi
2012-12-01
We aimed at verifying the hypothesis that facial mimicry is causally and selectively involved in emotion recognition. For this purpose, in Experiment 1, we explored the effect of tonic contraction of muscles in upper or lower half of participants' face on their ability to recognize emotional facial expressions. We found that the "lower" manipulation specifically impaired recognition of happiness and disgust, the "upper" manipulation impaired recognition of anger, while both manipulations affected recognition of fear; recognition of surprise and sadness were not affected by either blocking manipulations. In Experiment 2, we verified whether emotion recognition is hampered by stimuli in which an upper or lower half-face showing an emotional expression is combined with a neutral half-face. We found that the neutral lower half-face interfered with recognition of happiness and disgust, whereas the neutral upper half impaired recognition of anger; recognition of fear and sadness was impaired by both manipulations, whereas recognition of surprise was not affected by either manipulation. Taken together, the present findings support simulation models of emotion recognition and provide insight into the role of mimicry in comprehension of others' emotional facial expressions. PsycINFO Database Record (c) 2012 APA, all rights reserved.
Social Behavior and Impairments in Social Cognition Following Traumatic Brain Injury.
May, Michelle; Milders, Maarten; Downey, Bruce; Whyte, Maggie; Higgins, Vanessa; Wojcik, Zuzana; Amin, Sophie; O'Rourke, Suzanne
2017-05-01
The negative effect of changes in social behavior following traumatic brain injury (TBI) are known, but much less is known about the neuropsychological impairments that may underlie and predict these changes. The current study investigated possible associations between post-injury behavior and neuropsychological competencies of emotion recognition, understanding intentions, and response selection, that have been proposed as important for social functioning. Forty participants with TBI and 32 matched healthy participants completed a battery of tests assessing the three functions of interest. In addition, self- and proxy reports of pre- and post-injury behavior, mood, and community integration were collected. The TBI group performed significantly poorer than the comparison group on all tasks of emotion recognition, understanding intention, and on one task of response selection. Ratings of current behavior suggested significant changes in the TBI group relative to before the injury and showed significantly poorer community integration and interpersonal behavior than the comparison group. Of the three functions considered, emotion recognition was associated with both post-injury behavior and community integration and this association could not be fully explained by injury severity, time since injury, or education. The current study confirmed earlier findings of associations between emotion recognition and post-TBI behavior, providing partial evidence for models proposing emotion recognition as one of the pre-requisites for adequate social functioning. (JINS, 2017, 23, 400-411).
Mossaheb, Nilufar; Kaufmann, Rainer M; Schlögelhofer, Monika; Aninilkumparambil, Thushara; Himmelbauer, Claudia; Gold, Anna; Zehetmayer, Sonja; Hoffmann, Holger; Traue, Harald C; Aschauer, Harald
2018-01-01
Social interactive functions such as facial emotion recognition and smell identification have been shown to differ between women and men. However, little is known about how these differences are mirrored in patients with schizophrenia and how these abilities interact with each other and with other clinical variables in patients vs. healthy controls. Standardized instruments were used to assess facial emotion recognition [Facially Expressed Emotion Labelling (FEEL)] and smell identification [University of Pennsylvania Smell Identification Test (UPSIT)] in 51 patients with schizophrenia spectrum disorders and 79 healthy controls; furthermore, working memory functions and clinical variables were assessed. In both the univariate and the multivariate results, illness showed a significant influence on UPSIT and FEEL. The inclusion of age and working memory in the MANOVA resulted in a differential effect with sex and working memory as remaining significant factors. Duration of illness was correlated with both emotion recognition and smell identification in men only, whereas immediate general psychopathology and negative symptoms were associated with emotion recognition only in women. Being affected by schizophrenia spectrum disorder impacts one's ability to correctly recognize facial affects and identify odors. Converging evidence suggests a link between the investigated basic and social cognitive abilities in patients with schizophrenia spectrum disorders with a strong contribution of working memory and differential effects of modulators in women vs. men.
Age-related differences in emotion recognition ability: a cross-sectional study.
Mill, Aire; Allik, Jüri; Realo, Anu; Valk, Raivo
2009-10-01
Experimental studies indicate that recognition of emotions, particularly negative emotions, decreases with age. However, there is no consensus at which age the decrease in emotion recognition begins, how selective this is to negative emotions, and whether this applies to both facial and vocal expression. In the current cross-sectional study, 607 participants ranging in age from 18 to 84 years (mean age = 32.6 +/- 14.9 years) were asked to recognize emotions expressed either facially or vocally. In general, older participants were found to be less accurate at recognizing emotions, with the most distinctive age difference pertaining to a certain group of negative emotions. Both modalities revealed an age-related decline in the recognition of sadness and -- to a lesser degree -- anger, starting at about 30 years of age. Although age-related differences in the recognition of expression of emotion were not mediated by personality traits, 2 of the Big 5 traits, openness and conscientiousness, made an independent contribution to emotion-recognition performance. Implications of age-related differences in facial and vocal emotion expression and early onset of the selective decrease in emotion recognition are discussed in terms of previous findings and relevant theoretical models.
An investigation of the effect of race-based social categorization on adults' recognition of emotion.
Reyes, B Nicole; Segal, Shira C; Moulson, Margaret C
2018-01-01
Emotion recognition is important for social interaction and communication, yet previous research has identified a cross-cultural emotion recognition deficit: Recognition is less accurate for emotions expressed by individuals from a cultural group different than one's own. The current study examined whether social categorization based on race, in the absence of cultural differences, influences emotion recognition in a diverse context. South Asian and White Canadians in the Greater Toronto Area completed an emotion recognition task that required them to identify the seven basic emotional expressions when posed by members of the same two groups, allowing us to tease apart the contributions of culture and social group membership. Contrary to our hypothesis, there was no mutual in-group advantage in emotion recognition: Participants were not more accurate at recognizing emotions posed by their respective racial in-groups. Both groups were more accurate at recognizing expressions when posed by South Asian faces, and White participants were more accurate overall compared to South Asian participants. These results suggest that in a diverse environment, categorization based on race alone does not lead to the creation of social out-groups in a way that negatively impacts emotion recognition.
Dopamine and light: effects on facial emotion recognition.
Cawley, Elizabeth; Tippler, Maria; Coupland, Nicholas J; Benkelfat, Chawki; Boivin, Diane B; Aan Het Rot, Marije; Leyton, Marco
2017-09-01
Bright light can affect mood states and social behaviours. Here, we tested potential interacting effects of light and dopamine on facial emotion recognition. Participants were 32 women with subsyndromal seasonal affective disorder tested in either a bright (3000 lux) or dim light (10 lux) environment. Each participant completed two test days, one following the ingestion of a phenylalanine/tyrosine-deficient mixture and one with a nutritionally balanced control mixture, both administered double blind in a randomised order. Approximately four hours post-ingestion participants completed a self-report measure of mood followed by a facial emotion recognition task. All testing took place between November and March when seasonal symptoms would be present. Following acute phenylalanine/tyrosine depletion (APTD), compared to the nutritionally balanced control mixture, participants in the dim light condition were more accurate at recognising sad faces, less likely to misclassify them, and faster at responding to them, effects that were independent of changes in mood. Effects of APTD on responses to sad faces in the bright light group were less consistent. There were no APTD effects on responses to other emotions, with one exception: a significant light × mixture interaction was seen for the reaction time to fear, but the pattern of effect was not predicted a priori or seen on other measures. Together, the results suggest that the processing of sad emotional stimuli might be greater when dopamine transmission is low. Bright light exposure, used for the treatment of both seasonal and non-seasonal mood disorders, might produce some of its benefits by preventing this effect.
Actively Paranoid Patients with Schizophrenia Over Attribute Anger to Neutral Faces
Pinkham, Amy E.; Brensinger, Colleen; Kohler, Christian; Gur, Raquel E.; Gur, Ruben C.
2010-01-01
Previous investigations of the influence of paranoia on facial affect recognition in schizophrenia have been inconclusive, as some studies demonstrate better performance for paranoid relative to non-paranoid patients and others show that paranoid patients display greater impairments. These studies have been limited by small sample sizes and inconsistencies in the criteria used to define groups. Here, we utilized an established emotion recognition task and a large sample to examine differential performance in emotion recognition ability between patients who were actively paranoid (AP) and those who were not actively paranoid (NAP). Accuracy and error patterns on the Penn Emotion Recognition test (ER40) were examined in 132 patients (64 NAP and 68 AP). Groups were defined based on the presence of paranoid ideation at the time of testing rather than diagnostic subtype. AP and NAP patients did not differ in overall task accuracy; however, an emotion by group interaction indicated that AP patients were significantly worse than NAP patients at correctly labeling neutral faces. A comparison of error patterns on neutral stimuli revealed that the groups differed only in misattributions of anger expressions, with AP patients being significantly more likely to misidentify a neutral expression as angry. The present findings suggest that paranoia is associated with a tendency to over-attribute threat to ambiguous stimuli and also lend support to emerging hypotheses of amygdala hyperactivation as a potential neural mechanism for paranoid ideation. PMID:21112186
Berzenski, Sara R; Yates, Tuppett M
2017-10-01
The ability to recognize and label emotions serves as a building block by which children make sense of the world and learn how to interact with social partners. However, the timing and salience of influences on emotion recognition development are not fully understood. Path analyses evaluated the contributions of parenting and child narrative coherence to the development of emotion recognition across ages 4 through 8 in a diverse (50% female; 46% Hispanic, 18.4% Black, 11.2% White, 0.4% Asian, 24.0% multiracial) longitudinally followed sample of 250 caregiver-child dyads. Parenting behaviors during interactions (i.e., support, instructional quality, intrusiveness, and hostility) and children's narrative coherence during the MacArthur Story Stem Battery were observed at ages 4 and 6. Emotion recognition increased from age 4 to 8. Parents' supportive presence at age 4 and instructional quality at age 6 predicted increased emotion recognition at 8, beyond initial levels of emotion recognition and child cognitive ability. There were no significant effects of negative parenting (i.e., intrusiveness or hostility) at 4 or 6 on emotion recognition. Child narrative coherence at ages 4 and 6 predicted increased emotion recognition at 8. Emotion recognition at age 4 predicted increased parent instructional quality and decreased intrusiveness at 6. These findings clarify whether and when familial and child factors influence emotion recognition development. Influences on emotion recognition development emerged as differentially salient across time periods, such that there is a need to develop and implement targeted interventions to promote positive parenting skills and children's narrative coherence at specific ages.
Adams, Sally; Penton-Voak, Ian S; Harmer, Catherine J; Holmes, Emily A; Munafò, Marcus R
2013-01-01
Background We have developed a new paradigm that targets the recognition of facial expression of emotions. Here we report the protocol of a randomised controlled trial of the effects of emotion recognition training on mood in a sample of individuals with depressive symptoms over a 6-week follow-up period. Methods/Design We will recruit 190 adults from the general population who report high levels of depressive symptoms (defined as a score ≥ 14 on the Beck Depression Inventory-II). Participants will attend a screening session and will be randomised to intervention or control procedures, repeated five times over consecutive days (Monday to Friday). A follow-up session will take place at end-of-treatment, 2 weeks and 6 weeks after training. Our primary study outcome will be depressive symptoms, Beck Depression Inventory-II (rated over the past two weeks). Our secondary outcomes are: depressive symptoms, Hamilton Rating Scale for Depression; anxiety symptoms, Beck Anxiety Inventory (rated over the past month); positive affect, Positive and Negative Affect Schedule (rated as ‘how you feel right now’); negative affect, Positive and Negative Affect Schedule (rated as ‘how you feel right now’); emotion sensitivity, Emotion Recognition Task (test phase); approach motivation and persistence, the Fishing Game; and depressive interpretation bias, Scrambled Sentences Test. Discussion This study is of a novel cognitive bias modification technique that targets biases in emotional processing characteristic of depression, and can be delivered automatically via computer, Internet or Smartphone. It therefore has potential to be a valuable cost-effective adjunctive treatment for depression which may be used together with more traditional psychotherapy, cognitive-behavioural therapy and pharmacotherapy. Trial registration Current Controlled Trials: ISRCTN17767674 PMID:23725208
Ji, E; Weickert, C S; Lenroot, R; Kindler, J; Skilleter, A J; Vercammen, A; White, C; Gur, R E; Weickert, T W
2016-05-03
Estrogen has been implicated in the development and course of schizophrenia with most evidence suggesting a neuroprotective effect. Treatment with raloxifene, a selective estrogen receptor modulator, can reduce symptom severity, improve cognition and normalize brain activity during learning in schizophrenia. People with schizophrenia are especially impaired in the identification of negative facial emotions. The present study was designed to determine the extent to which adjunctive raloxifene treatment would alter abnormal neural activity during angry facial emotion recognition in schizophrenia. Twenty people with schizophrenia (12 men, 8 women) participated in a 13-week, randomized, double-blind, placebo-controlled, crossover trial of adjunctive raloxifene treatment (120 mg per day orally) and performed a facial emotion recognition task during functional magnetic resonance imaging after each treatment phase. Two-sample t-tests in regions of interest selected a priori were performed to assess activation differences between raloxifene and placebo conditions during the recognition of angry faces. Adjunctive raloxifene significantly increased activation in the right hippocampus and left inferior frontal gyrus compared with the placebo condition (family-wise error, P<0.05). There was no significant difference in performance accuracy or reaction time between active and placebo conditions. To the best of our knowledge, this study provides the first evidence suggesting that adjunctive raloxifene treatment changes neural activity in brain regions associated with facial emotion recognition in schizophrenia. These findings support the hypothesis that estrogen plays a modifying role in schizophrenia and shows that adjunctive raloxifene treatment may reverse abnormal neural activity during facial emotion recognition, which is relevant to impaired social functioning in men and women with schizophrenia.
Balconi, Michela; Vanutelli, Maria Elide; Finocchiaro, Roberta
2014-09-26
The paper explored emotion comprehension in children with regard to facial expressions of emotion. The effects of valence and arousal evaluation, of context, and of psychophysiological measures were monitored. Subjective evaluations of valence (positive vs. negative) and arousal (high vs. low), together with contextual variables (facial expression vs. facial expression and script), were expected to modulate the psychophysiological responses. Self-report measures (correct recognition, arousal and valence attribution) and psychophysiological correlates (facial electromyography, EMG; skin conductance response, SCR; and heart rate, HR) were observed while children (N = 26; mean age = 8.75 y; range 6-11 y) looked at six facial expressions of emotion (happiness, anger, fear, sadness, surprise, and disgust) and six emotional scripts (contextualized facial expressions). Competencies in recognition and in the evaluation of valence and arousal were tested in concomitance with psychophysiological variations; specifically, we tested the congruence of these multiple measures. Log-linear analysis and repeated-measures ANOVAs showed different representations across subjects as a function of emotion. Children's recognition and attribution were well developed for some emotions (such as anger, fear, surprise and happiness), whereas some other emotions (mainly disgust and sadness) were less clearly represented. SCR, HR and EMG measures were modulated by the evaluation based on valence and arousal, with increased psychophysiological values mainly in response to anger, fear and happiness. As shown by multiple regression analysis, significant consonance was found between self-report measures and psychophysiological behavior, mainly for emotions rated as more arousing and negative in valence. The multilevel measures are discussed in light of a dimensional attribution model.
Amlerova, Jana; Cavanna, Andrea E; Bradac, Ondrej; Javurkova, Alena; Raudenska, Jaroslava; Marusic, Petr
2014-07-01
The abilities to identify facial expression from another person's face and to attribute mental states to others depend on preserved function of the temporal lobes. In the present study, we set out to evaluate emotion recognition and social cognition in presurgical and postsurgical patients with unilateral refractory temporal lobe epilepsy (TLE). The aim of our study was to investigate the effects of TLE surgery and to identify the main risk factors for impairment in these functions. We recruited 30 patients with TLE for longitudinal data analysis (14 with right-sided and 16 with left-sided TLE) and 74 patients for cross-sectional data analysis (37 with right-sided and 37 with left-sided TLE) plus 20 healthy controls. Besides standard neuropsychological assessment, we administered an analog of the Ekman and Friesen test and the Faux Pas Test to assess emotion recognition and social cognition, respectively. Both emotion recognition and social cognition were impaired in the group of patients with TLE, irrespective of the focus side, compared with healthy controls. The performance in both tests was strongly dependent on the intelligence level. Beyond intelligence level, earlier age at epilepsy onset, longer disease duration, and history of early childhood brain injury predicted social cognition problems in patients with TLE. Epilepsy surgery within the temporal lobe seems to have a neutral effect on patients' performance in both domains. However, a few individual patients appear to be at risk of postoperative decline, even when seizure freedom is achieved following epilepsy surgery.
Wolf, Richard C; Pujara, Maia; Baskaya, Mustafa K; Koenigs, Michael
2016-09-01
Facial emotion recognition is a critical aspect of human communication. Since abnormalities in facial emotion recognition are associated with social and affective impairment in a variety of psychiatric and neurological conditions, identifying the neural substrates and psychological processes underlying facial emotion recognition will help advance basic and translational research on social-affective function. Ventromedial prefrontal cortex (vmPFC) has recently been implicated in deploying visual attention to the eyes of emotional faces, although there is mixed evidence regarding the importance of this brain region for recognition accuracy. In the present study of neurological patients with vmPFC damage, we used an emotion recognition task with morphed facial expressions of varying intensities to determine (1) whether vmPFC is essential for emotion recognition accuracy, and (2) whether instructed attention to the eyes of faces would be sufficient to improve any accuracy deficits. We found that vmPFC lesion patients are impaired, relative to neurologically healthy adults, at recognizing moderate intensity expressions of anger and that recognition accuracy can be improved by providing instructions of where to fixate. These results suggest that vmPFC may be important for the recognition of facial emotion through a role in guiding visual attention to emotionally salient regions of faces.
Leist, Tatyana; Dadds, Mark R
2009-04-01
Emotional processing styles appear to characterize various forms of psychopathology and environmental adversity in children. For example, autistic, anxious, high- and low-emotion conduct problem children, and children who have been maltreated, all appear to show specific deficits and strengths in recognizing the facial expressions of emotions. Until now, the relationships between emotion recognition, antisocial behaviour, emotional problems, callous-unemotional (CU) traits and early maltreatment have never been assessed simultaneously in one study, and the specific associations of emotion recognition to maltreatment and child characteristics are therefore unknown. We examined facial-emotion processing in a sample of 23 adolescents selected for high-risk status on the variables of interest. As expected, maltreatment and child characteristics showed unique associations. CU traits were uniquely related to impairments in fear recognition. Antisocial behaviour was uniquely associated with better fear recognition, but impaired anger recognition. Emotional problems were associated with better recognition of anger and sadness, but lower recognition of neutral faces. Maltreatment was predictive of superior recognition of fear and sadness. The findings are considered in terms of social information-processing theories of psychopathology. Implications for clinical interventions are discussed.
Social learning modulates the lateralization of emotional valence.
Shamay-Tsoory, Simone G; Lavidor, Michal; Aharon-Peretz, Judith
2008-08-01
Although neuropsychological studies of lateralization of emotion have emphasized valence (positive vs. negative) or type (basic vs. complex) dimensions, the interaction between the two dimensions has yet to be elucidated. The purpose of the current study was to test the hypothesis that recognition of basic emotions is processed preferentially by the right prefrontal cortex (PFC), whereas recognition of complex social emotions is processed preferentially by the left PFC. Experiment 1 assessed the ability of healthy controls and patients with right and left PFC lesions to recognize basic and complex emotions. Experiment 2 modeled the patients' data from Experiment 1 on healthy participants under lateralized displays of the emotional stimuli. Both experiments support the Type as well as the Valence Hypotheses. However, our findings indicate that the Valence Hypothesis holds for basic but less so for complex emotions. It is suggested that, since social learning overrules the basic preference of valence in the hemispheres, the processing of complex emotions in the hemispheres is less affected by valence.
Luebbe, Aaron M; Fussner, Lauren M; Kiel, Elizabeth J; Early, Martha C; Bell, Debora J
2013-12-01
Depressive symptomatology is associated with impaired recognition of emotion. Previous investigations have predominantly focused on emotion recognition of static facial expressions neglecting the influence of social interaction and critical contextual factors. In the current study, we investigated how youth and maternal symptoms of depression may be associated with emotion recognition biases during familial interactions across distinct contextual settings. Further, we explored if an individual's current emotional state may account for youth and maternal emotion recognition biases. Mother-adolescent dyads (N = 128) completed measures of depressive symptomatology and participated in three family interactions, each designed to elicit distinct emotions. Mothers and youth completed state affect ratings pertaining to self and other at the conclusion of each interaction task. Using multiple regression, depressive symptoms in both mothers and adolescents were associated with biased recognition of both positive affect (i.e., happy, excited) and negative affect (i.e., sadness, anger, frustration); however, this bias emerged primarily in contexts with a less strong emotional signal. Using actor-partner interdependence models, results suggested that youth's own state affect accounted for depression-related biases in their recognition of maternal affect. State affect did not function similarly in explaining depression-related biases for maternal recognition of adolescent emotion. Together these findings suggest a similar negative bias in emotion recognition associated with depressive symptoms in both adolescents and mothers in real-life situations, albeit potentially driven by different mechanisms.
Impaired recognition of happy facial expressions in bipolar disorder.
Lawlor-Savage, Linette; Sponheim, Scott R; Goghari, Vina M
2014-08-01
The ability to accurately judge facial expressions is important in social interactions. Individuals with bipolar disorder have been found to be impaired in emotion recognition; however, the specifics of the impairment are unclear. This study investigated whether facial emotion recognition difficulties in bipolar disorder reflect general cognitive, or emotion-specific, impairments. Impairment in the recognition of particular emotions and the role of processing speed in facial emotion recognition were also investigated. Clinically stable bipolar patients (n = 17) and healthy controls (n = 50) judged five facial expressions in two presentation types, time-limited and self-paced. An age recognition condition was used as an experimental control. Bipolar patients' overall facial recognition ability was unimpaired. However, patients' specific ability to judge happy expressions under time constraints was impaired. Findings suggest a deficit in happy emotion recognition impacted by processing speed. Given the limited sample size, further investigation with a larger patient sample is warranted.
Facial recognition: a cognitive study of elderly dementia patients and normal older adults.
Zandi, T; Cooper, M; Garrison, L
1992-01-01
Dementia patients' and healthy older adults' recognition of familiar, everyday emotional facial expressions was tested. In three conditions, subjects were required to name the emotions depicted in pictures and to produce them when presented with the verbal labels of the expressions. The dementia patients' best performance occurred when they had access to the verbal labels while viewing the pictures. The major deficiency in facial recognition was found to be dysnomia-related. Findings of this study suggest that the connection between the gnostic units of expression and the gnostic units of verbal labeling is not significantly impaired among the dementia patients.
Thonse, Umesh; Behere, Rishikesh V; Praharaj, Samir Kumar; Sharma, Podila Sathya Venkata Narasimha
2018-06-01
Facial emotion recognition deficits have been consistently demonstrated in patients with severe mental disorders. Expressed emotion is found to be an important predictor of relapse. However, the relationship between facial emotion recognition abilities and expressed emotions, and its influence on socio-occupational functioning, in schizophrenia versus bipolar disorder has not been studied. In this study we examined 91 patients with schizophrenia and 71 with bipolar disorder for psychopathology, socio-occupational functioning and emotion recognition abilities. Primary caregivers of 62 patients with schizophrenia and 49 with bipolar disorder were assessed on the Family Attitude Questionnaire to assess their expressed emotions. Patients with schizophrenia and bipolar disorder performed similarly on the emotion recognition task. Patients in the schizophrenia group experienced more critical comments and had poorer socio-occupational functioning than patients with bipolar disorder. Poorer socio-occupational functioning in patients with schizophrenia was significantly associated with greater dissatisfaction in their caregivers. In patients with bipolar disorder, poorer emotion recognition scores significantly correlated with poorer adaptive living skills and greater hostility and dissatisfaction in their caregivers. The findings of our study suggest that emotion recognition abilities in patients with bipolar disorder are associated with negative expressed emotions, leading to problems in adaptive living skills.
Social approach and emotion recognition in fragile X syndrome.
Williams, Tracey A; Porter, Melanie A; Langdon, Robyn
2014-03-01
Evidence is emerging that individuals with Fragile X syndrome (FXS) display emotion recognition deficits, which may contribute to their significant social difficulties. The current study investigated the emotion recognition abilities, and social approachability judgments, of FXS individuals when processing emotional stimuli. Relative to chronological age- (CA-) and mental age- (MA-) matched controls, the FXS group performed significantly more poorly on the emotion recognition tasks, and displayed a bias towards detecting negative emotions. Moreover, after controlling for emotion recognition deficits, the FXS group displayed significantly reduced ratings of social approachability. These findings suggest that a social anxiety pattern, rather than poor socioemotional processing, may best explain the social avoidance observed in FXS.
Emotion recognition of virtual agents’ facial expressions: the effects of age and emotion intensity
Beer, Jenay M.; Fisk, Arthur D.; Rogers, Wendy A.
2014-01-01
People make determinations about the social characteristics of an agent (e.g., robot or virtual agent) by interpreting social cues displayed by the agent, such as facial expressions. Although a considerable amount of research has been conducted investigating age-related differences in emotion recognition of human faces (e.g., Sullivan & Ruffman, 2004), the effect of age on emotion identification of virtual agent facial expressions has been largely unexplored. Age-related differences in emotion recognition of facial expressions are an important factor to consider in the design of agents that may assist older adults in a recreational or healthcare setting. The purpose of the current research was to investigate whether age-related differences in facial emotion recognition extend to emotion-expressive virtual agents. Younger and older adults performed a recognition task with a virtual agent expressing six basic emotions. Larger age-related differences were expected for virtual agents displaying negative emotions, such as anger, sadness, and fear. In fact, the results indicated that older adults showed a decrease in emotion recognition accuracy for a virtual agent's emotions of anger, fear, and happiness. PMID:25552896
Kim, Youl-Ri; Eom, Jin-Sup; Yang, Jae-Won; Kang, Jiwon; Treasure, Janet
2015-01-01
Social difficulties and problems related to eating behaviour are common features of both anorexia nervosa (AN) and bulimia nervosa (BN). The aim of this study was to examine the impact of intranasal oxytocin on consummatory behaviour and emotional recognition in patients with AN and BN in comparison to healthy controls. A total of 102 women, including 35 patients with anorexia nervosa (AN), 34 patients with bulimia nervosa (BN), and 33 healthy university students of comparable age and intelligence, participated in a double-blind, single dose placebo-controlled cross-over study. A single dose of intranasal oxytocin (40 IU) (or a placebo) was followed by an emotional recognition task and an apple juice drink. Food intake was then recorded for 24 hours post-test. Oxytocin produced no significant change in appetite in the acute or 24-hour free-living settings in healthy controls, whereas there was a decrease in calorie consumption over 24 hours in patients with BN. Oxytocin produced a small increase in emotion recognition sensitivity in healthy controls and in patients with BN. In patients with AN, oxytocin had no effect on emotion recognition sensitivity or on consummatory behaviour. The impact of oxytocin on appetite and social cognition varied between people with AN and BN. A single dose of intranasal oxytocin decreased caloric intake over 24 hours in people with BN. People with BN showed enhanced emotional sensitivity under the oxytocin condition, similar to healthy controls. These effects of oxytocin were not found in patients with AN. ClinicalTrials.gov KCT00000716.
Comparison of emotion recognition from facial expression and music.
Gaspar, Tina; Labor, Marina; Jurić, Iva; Dumancić, Dijana; Ilakovac, Vesna; Heffer, Marija
2011-01-01
The recognition of basic emotions in everyday communication involves the interpretation of different visual and auditory cues. The ability to recognize emotions is not clearly determined, as their presentation is usually very short (micro-expressions) and the recognition itself does not have to be a conscious process. We hypothesized that recognition of emotions from facial expressions is favored over recognition of emotions communicated through music. In order to compare the success rates in recognizing emotions presented as facial expressions or in classical music works, we conducted a survey that included 90 elementary school and 87 high school students from Osijek (Croatia). The participants had to match 8 photographs of different emotions expressed on the face and 8 pieces of classical music with 8 offered emotions. The recognition of emotions expressed through classical music pieces was significantly less successful than the recognition of emotional facial expressions. The high school students were significantly better at recognizing facial emotions than the elementary school students, and girls were better than boys. The success rate in recognizing emotions from music pieces was associated with higher grades in mathematics. Basic emotions are far better recognized when presented on human faces than in music, possibly because understanding facial emotions is one of the oldest communication skills in human society. The female advantage in emotion recognition may have been selected for because of the necessity of communicating with newborns during early development. Proficiency in recognizing the emotional content of music and mathematical skills probably share some general cognitive abilities such as attention, memory and motivation. Music pieces are probably processed differently in the brain than facial expressions and are consequently evaluated differently as relevant emotional cues.
Dolder, Patrick C; Holze, Friederike; Liakoni, Evangelia; Harder, Samuel; Schmid, Yasmin; Liechti, Matthias E
2017-01-01
Social cognition influences social interactions. Alcohol reportedly facilitates social interactions. However, the acute effects of alcohol on social cognition are relatively poorly studied. We investigated the effects of alcoholic or non-alcoholic beer on emotion recognition, empathy, and sexual arousal using the dynamic face emotion recognition task (FERT), Multifaceted Empathy Test (MET), and Sexual Arousal Task (SAT) in a double-blind, random-order, cross-over study in 60 healthy social drinkers. We also assessed subjective effects using visual analog scales (VASs), blood alcohol concentrations, and plasma oxytocin levels. Alcohol increased VAS ratings of stimulated, happy, talkative, open, and want to be with others. The subjective effects of alcohol were greater in participants with higher trait inhibitedness. Alcohol facilitated the recognition of happy faces on the FERT and enhanced emotional empathy for positive stimuli on the MET, particularly in participants with low trait empathy. Pictures of explicit sexual content were rated as less pleasant than neutral pictures after non-alcoholic beer but not after alcoholic beer. Explicit sexual pictures were rated as more pleasant after alcoholic beer compared with non-alcoholic beer, particularly in women. Alcohol did not alter the levels of circulating oxytocin. Alcohol biased emotion recognition toward better decoding of positive emotions and increased emotional concern for positive stimuli. No support was found for a modulatory role of oxytocin. Alcohol also facilitated the viewing of sexual images, consistent with disinhibition, but it did not actually enhance sexual arousal. These effects of alcohol on social cognition likely enhance sociability. www.clinicaltrials.gov/ct2/show/NCT02318823.
MDMA enhances emotional empathy and prosocial behavior
Hysek, Cédric M.; Schmid, Yasmin; Simmler, Linda D.; Domes, Gregor; Heinrichs, Markus; Eisenegger, Christoph; Preller, Katrin H.; Quednow, Boris B.
2014-01-01
3,4-Methylenedioxymethamphetamine (MDMA, ‘ecstasy’) releases serotonin and norepinephrine. MDMA is reported to produce empathogenic and prosocial feelings. It is unknown whether MDMA in fact alters empathic concern and prosocial behavior. We investigated the acute effects of MDMA using the Multifaceted Empathy Test (MET), dynamic Face Emotion Recognition Task (FERT) and Social Value Orientation (SVO) test. We also assessed effects of MDMA on plasma levels of hormones involved in social behavior using a placebo-controlled, double-blind, random-order, cross-over design in 32 healthy volunteers (16 women). MDMA enhanced explicit and implicit emotional empathy in the MET and increased prosocial behavior in the SVO test in men. MDMA did not alter cognitive empathy in the MET but impaired the identification of negative emotions, including fearful, angry and sad faces, in the FERT, particularly in women. MDMA increased plasma levels of cortisol and prolactin, which are markers of serotonergic and noradrenergic activity, and of oxytocin, which has been associated with prosocial behavior. In summary, MDMA sex-specifically altered the recognition of emotions, emotional empathy and prosociality. These effects likely enhance sociability when MDMA is used recreationally and may be useful when MDMA is administered in conjunction with psychotherapy in patients with social dysfunction or post-traumatic stress disorder. PMID:24097374
Balconi, M; Cobelli, C
2015-02-26
The present research explored the cortical correlates of emotional memories in response to words and pictures. Subjects' performance (Accuracy Index, AI; response times, RTs; RTs/AI) was assessed while repetitive transcranial magnetic stimulation (rTMS) was applied over the left dorsolateral prefrontal cortex (LDLPFC). Specifically, the role of the LDLPFC was tested with a memory task in which old (previously encoded targets) and new (previously not encoded distractors) emotional pictures/words had to be recognized. The valence (positive vs. negative) and arousing power (high vs. low) of the stimuli were also modulated. Moreover, subjective evaluation of the emotional stimuli in terms of valence/arousal was explored. We found significant performance improvement (higher AI, reduced RTs, improved general performance) in response to rTMS. This "better recognition effect" was related only to specific emotional features, that is, positive high-arousal pictures or words. Moreover, no significant differences were found between stimulus categories. A direct relationship was also observed between subjective evaluation of emotional cues and memory performance when rTMS was applied to the LDLPFC. Supported by the valence and approach model of emotions, we suggest that a left-lateralized prefrontal system may induce better recognition of positive high-arousal words, and that evaluation of the emotional cue is related to prefrontal activation, affecting recognition memory for emotions. Copyright © 2014 IBRO. Published by Elsevier Ltd. All rights reserved.
Scotland, Jennifer L; McKenzie, Karen; Cossar, Jill; Murray, Aja; Michie, Amanda
2016-01-01
This study aimed to evaluate the emotion recognition abilities of adults (n=23) with an intellectual disability (ID) compared with a control group of children (n=23) without ID matched for estimated cognitive ability. The study examined the impact of: task paradigm, stimulus type and preferred processing style (global/local) on accuracy. We found that, after controlling for estimated cognitive ability, the control group performed significantly better than the individuals with ID. This provides some support for the emotion specificity hypothesis. Having a more local processing style did not significantly mediate the relation between having ID and emotion recognition, but did significantly predict emotion recognition ability after controlling for group. This suggests that processing style is related to emotion recognition independently of having ID. The availability of contextual information improved emotion recognition for people with ID when compared with line drawing stimuli, and identifying a target emotion from a choice of two was relatively easier for individuals with ID, compared with the other task paradigms. The results of the study are considered in the context of current theories of emotion recognition deficits in individuals with ID. Copyright © 2015 Elsevier Ltd. All rights reserved.
Brennan, Sean; McLoughlin, Declan M; O'Connell, Redmond; Bogue, John; O'Connor, Stephanie; McHugh, Caroline; Glennon, Mark
2017-05-01
Transcranial direct current stimulation (tDCS) can enhance a range of neuropsychological functions, but its efficacy in addressing clinically significant emotion recognition deficits associated with depression is largely untested. A randomized crossover placebo-controlled study was used to investigate the effects of tDCS over the left dorsolateral prefrontal cortex (L-DLPFC) on a range of neuropsychological variables associated with depression as well as neural activity in the associated brain region. A series of computerized tests was administered to clinical (n = 17) and control groups (n = 20) during sham and anodal (1.5 mA) stimulation. Anodal tDCS led to a significant main effect for overall emotion recognition (p = .02), with a significant improvement in the control group (p = .04). Recognition of disgust was significantly greater in the clinical group (p = .01). Recognition of anger was significantly improved for the clinical group (p = .04) during anodal stimulation. Comparison of the groups for each of the six emotions at varying levels of expression showed that, at 40% intensity during anodal stimulation, recognition of happiness significantly improved for the clinical group (p = .01). Recognition of anger at 80% intensity during anodal stimulation also significantly improved for the clinical group (p = .02). These improvements were observed in the absence of any change in psychomotor speed or trail making ability during anodal stimulation. Working memory significantly improved during anodal stimulation for the clinical group but not for controls (p = .03). The tentative findings of this study indicate that tDCS can have a neuromodulatory effect on a range of neuropsychological variables. However, it is clear that there was wide variation in responses to tDCS and that individual differences and different approaches to testing and stimulation have a significant impact on final outcomes. Nonetheless, tDCS remains a promising tool for future neuropsychological research.
Wirkner, Janine; Weymar, Mathias; Löw, Andreas; Hamm, Alfons O.
2013-01-01
Recent animal and human research indicates that stress around the time of encoding enhances long-term memory for emotionally arousing events, but neural evidence remains unclear. In the present study we used the ERP old/new effect to investigate brain dynamics underlying the long-term effects of acute pre-encoding stress on memory for emotional and neutral scenes. Participants were exposed either to the Socially Evaluated Cold Pressor Test (SECPT) or a warm water control procedure before viewing 30 unpleasant, 30 neutral and 30 pleasant pictures. Two weeks after encoding, recognition memory was tested using 90 old and 90 new pictures. Emotional pictures were better recognized than neutral pictures in both groups and related to an enhanced centro-parietal ERP old/new difference (400–800 ms) during recognition, which suggests better recollection. Most interestingly, pre-encoding stress exposure specifically increased the ERP old/new effect for emotional (unpleasant) pictures, but not for neutral pictures. These enhanced ERP old/new differences for emotional (unpleasant) scenes were particularly pronounced for those participants who reported high levels of stress during the SECPT. The results suggest that acute pre-encoding stress specifically strengthens brain signals of emotional memories, substantiating a facilitating role of stress on memory for emotional scenes. PMID:24039697
Younger and Older Users’ Recognition of Virtual Agent Facial Expressions
Beer, Jenay M.; Smarr, Cory-Ann; Fisk, Arthur D.; Rogers, Wendy A.
2015-01-01
As technology advances, robots and virtual agents will be introduced into the home and healthcare settings to assist individuals, both young and old, with everyday living tasks. Understanding how users recognize an agent’s social cues is therefore imperative, especially in social interactions. Facial expression, in particular, is one of the most common non-verbal cues used to display and communicate emotion in on-screen agents (Cassell, Sullivan, Prevost, & Churchill, 2000). Age is important to consider because age-related differences in emotion recognition of human facial expression have been supported (Ruffman et al., 2008), with older adults showing a deficit for recognition of negative facial expressions. Previous work has shown that younger adults can effectively recognize facial emotions displayed by agents (Bartneck & Reichenbach, 2005; Courgeon et al. 2009; 2011; Breazeal, 2003); however, little research has compared in depth younger and older adults’ ability to label a virtual agent’s facial emotions, an important consideration because social agents will be required to interact with users of varying ages. If such age-related differences exist for recognition of virtual agent facial expressions, we aim to understand if those age-related differences are influenced by the intensity of the emotion, dynamic formation of emotion (i.e., a neutral expression developing into an expression of emotion through motion), or the type of virtual character differing by human-likeness. Study 1 investigated the relationship between age-related differences, the implication of dynamic formation of emotion, and the role of emotion intensity in emotion recognition of the facial expressions of a virtual agent (iCat). Study 2 examined age-related differences in recognition expressed by three types of virtual characters differing by human-likeness (non-humanoid iCat, synthetic human, and human).
Study 2 also investigated the role of configural and featural processing as a possible explanation for age-related differences in emotion recognition. First, our findings show age-related differences in the recognition of emotions expressed by a virtual agent, with older adults showing lower recognition for the emotions of anger, disgust, fear, happiness, sadness, and neutral. These age-related differences might be explained by older adults having difficulty discriminating similarity in the configural arrangement of facial features for certain emotions; for example, older adults often mislabeled the similar emotion of fear as surprise. Second, our results did not provide evidence for dynamic formation improving emotion recognition; but, in general, the intensity of the emotion improved recognition. Lastly, we learned that emotion recognition, for older and younger adults, differed by character type, from best to worst: human, synthetic human, and then iCat. Our findings provide guidance for design, as well as the development of a framework of age-related differences in emotion recognition. PMID:25705105
Menstrual-cycle dependent fluctuations in ovarian hormones affect emotional memory.
Bayer, Janine; Schultz, Heidrun; Gamer, Matthias; Sommer, Tobias
2014-04-01
The hormones progesterone and estradiol modulate neural plasticity in the hippocampus, the amygdala and the prefrontal cortex. These structures are involved in the superior memory for emotionally arousing information (EEM effects). Therefore, fluctuations in hormonal levels across the menstrual cycle are expected to influence activity in these areas as well as behavioral memory performance for emotionally arousing events. To test this hypothesis, naturally cycling women underwent functional magnetic resonance imaging during the encoding of emotional and neutral stimuli in the low-hormone early follicular and the high-hormone luteal phase. Their memory was tested after an interval of 48 h, because emotional arousal primarily enhances the consolidation of new memories. Whereas overall recognition accuracy remained stable across cycle phases, recognition quality varied with menstrual cycle phases. Particularly recollection-based recognition memory for negative items tended to decrease from early follicular to luteal phase. EEM effects for both valences were associated with higher activity in the right anterior hippocampus during early follicular compared to luteal phase. Valence-specific modulations were found in the anterior cingulate, the amygdala and the posterior hippocampus. Current findings connect to anxiolytic actions of estradiol and progesterone as well as to studies on fear conditioning. Moreover, they are in line with differential networks involved in EEM effects for positive and negative items. Copyright © 2014 Elsevier Inc. All rights reserved.
Facial Emotion Recognition and Expression in Parkinson’s Disease: An Emotional Mirror Mechanism?
Ricciardi, Lucia; Visco-Comandini, Federica; Erro, Roberto; Morgante, Francesca; Bologna, Matteo; Fasano, Alfonso; Ricciardi, Diego; Edwards, Mark J.; Kilner, James
2017-01-01
Background and aim: Parkinson’s disease (PD) patients have impairment of facial expressivity (hypomimia) and difficulties in interpreting the emotional facial expressions produced by others, especially for aversive emotions. We aimed to evaluate the ability to produce facial emotional expressions and to recognize facial emotional expressions produced by others in a group of PD patients and a group of healthy participants, in order to explore the relationship between these two abilities and any differences between the two groups of participants. Methods: Twenty non-demented, non-depressed PD patients and twenty healthy participants (HC) matched for demographic characteristics were studied. The ability to recognize emotional facial expressions was assessed with the Ekman 60-faces test (emotion recognition task). Participants were video-recorded while posing facial expressions of 6 primary emotions (happiness, sadness, surprise, disgust, fear and anger). The most expressive pictures for each emotion were derived from the videos. Ten healthy raters were asked to look at the pictures displayed on a computer screen in pseudo-random fashion and to identify the emotional label in a six-alternative forced-choice response format (emotion expressivity task). Reaction time (RT) and accuracy of responses were recorded. At the end of each trial the participant was asked to rate his/her confidence in his/her perceived accuracy of response. Results: For emotion recognition, PD patients scored lower than HC on the Ekman total score (p<0.001), and on the single-emotion sub-scores for happiness, fear, anger, sadness (p<0.01) and surprise (p = 0.02). In the facial emotion expressivity task, PD and HC significantly differed in the total score (p = 0.05) and in the sub-scores for happiness, sadness, and anger (all p<0.001). RT and the level of confidence showed significant differences between PD and HC for the same emotions.
There was a significant positive correlation between facial emotion recognition and expressivity in both groups; the correlation was even stronger when ranking emotions from the best recognized to the worst (R = 0.75, p = 0.004). Conclusions: PD patients showed difficulties in recognizing emotional facial expressions produced by others and in posing facial emotional expressions compared to healthy subjects. The linear correlation between recognition and expression in both experimental groups suggests that the two mechanisms share a common system, which could be deteriorated in patients with PD. These results open new clinical and rehabilitation perspectives. PMID:28068393
Wingenbach, Tanja S. H.; Brosnan, Mark; Pfaltz, Monique C.; Plichta, Michael M.; Ashwin, Chris
2018-01-01
According to embodied cognition accounts, viewing others’ facial emotion can elicit the respective emotion representation in observers which entails simulations of sensory, motor, and contextual experiences. In line with that, published research found viewing others’ facial emotion to elicit automatic matched facial muscle activation, which was further found to facilitate emotion recognition. Perhaps making congruent facial muscle activity explicit produces an even greater recognition advantage. If there is conflicting sensory information, i.e., incongruent facial muscle activity, this might impede recognition. The effects of actively manipulating facial muscle activity on facial emotion recognition from videos were investigated across three experimental conditions: (a) explicit imitation of viewed facial emotional expressions (stimulus-congruent condition), (b) pen-holding with the lips (stimulus-incongruent condition), and (c) passive viewing (control condition). It was hypothesised that (1) experimental condition (a) and (b) result in greater facial muscle activity than (c), (2) experimental condition (a) increases emotion recognition accuracy from others’ faces compared to (c), (3) experimental condition (b) lowers recognition accuracy for expressions with a salient facial feature in the lower, but not the upper face area, compared to (c). Participants (42 males, 42 females) underwent a facial emotion recognition experiment (ADFES-BIV) while electromyography (EMG) was recorded from five facial muscle sites. The experimental conditions’ order was counter-balanced. Pen-holding caused stimulus-incongruent facial muscle activity for expressions with facial feature saliency in the lower face region, which reduced recognition of lower face region emotions. Explicit imitation caused stimulus-congruent facial muscle activity without modulating recognition. Methodological implications are discussed. PMID:29928240
Emotion recognition and social adjustment in school-aged girls and boys.
Leppänen, J M; Hietanen, J K
2001-12-01
The present study investigated emotion recognition accuracy and its relation to social adjustment in 7- to 10-year-old children. The ability to recognize basic emotions from facial and vocal expressions was measured and compared to peer popularity and to teacher-rated social competence. The results showed that emotion recognition was related to these measures of social adjustment, but the gender of the child and the emotion category affected this relationship. Emotion recognition accuracy was significantly related to social adjustment for the girls, but not for the boys. For the girls, especially the recognition of surprise was related to social adjustment. Together, these results suggest that the ability to recognize others' emotional states from nonverbal cues is an important socio-cognitive ability for school-aged girls.
Problems of Face Recognition in Patients with Behavioral Variant Frontotemporal Dementia.
Chandra, Sadanandavalli Retnaswami; Patwardhan, Ketaki; Pai, Anupama Ramakanth
2017-01-01
Faces are very special as they are essential for social cognition in humans. It is partly understood that face processing, in its abstractness, involves several extrastriate areas. One of the most important causes of caregiver suffering in patients with anterior dementia is lack of empathy, which, apart from being a behavioral disorder, could also be due to failure to categorize the emotions of the people around them. Inclusion criteria: DSM-IV criteria for bvFTD; patients were tested for prosopagnosia using familiar faces, famous faces, smiling faces, crying faces and reflected faces on a simple picture card (Figure 1). Exclusion criteria: advanced illness and mixed causes. Of 46 patients (15 females, 31 males; mean age 51.5 years), 24 had defective face recognition: 10/15 females (70%) and 14/31 males. Familiar face recognition defects were found in 6/10 females and 6/14 males; in total, 40% (6/15) of females and 19.35% (6/31) of males with FTD had familiar face recognition defects. Famous face recognition defects were found in 9/10 females and 7/14 males; in total, 60% (9/15) of females with FTD had famous face recognition defects as against 22.6% (7/31) of males with FTD. Smiling face recognition defects were found in 8/10 females and no males, i.e., 53.33% (8/15) of females. Crying face recognition defects were found in 3/10 females and 2/14 males, i.e., 20% (3/15) of females and 6.5% (2/31) of males. Reflected face recognition defects were found in 4 females. Famous face recognition and positive emotion recognition were defective in 80%; only 20% comprehended positive emotions. Face recognition defects were found in only 45% of males and were more common in females; face recognition is more affected in females with FTD. The differential involvement of different aspects of face recognition could be one of the important factors underlying the decline in the emotional and social behavior of these patients. Understanding these pathological processes will give more insight into patient behavior.
Huang, Charles Lung-Cheng; Hsiao, Sigmund; Hwu, Hai-Gwo; Howng, Shen-Long
2013-10-30
This study assessed facial emotion recognition abilities in subjects with paranoid and non-paranoid schizophrenia using signal detection theory. We explored the differential deficits in facial emotion recognition in 44 paranoid patients with schizophrenia (PS) and 30 non-paranoid patients with schizophrenia (NPS), compared to 80 healthy controls. We used morphed faces with different intensities of emotion and computed the sensitivity index (d') for each emotion. The results showed that performance differed between the schizophrenia and healthy control groups in the recognition of both negative and positive affects. The PS group performed worse than the healthy control group but better than the NPS group in overall performance. Performance differed between the NPS and healthy control groups in the recognition of all basic emotions and neutral faces; between the PS and healthy control groups in the recognition of angry faces; and between the PS and NPS groups in the recognition of happiness, anger, sadness, disgust, and neutral affects. The facial emotion recognition impairment in schizophrenia may reflect a generalized deficit rather than a negative-emotion-specific deficit. The PS group performed worse than the control group, but better than the NPS group, in facial expression recognition, with differential deficits between PS and NPS patients. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
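The sensitivity index used in the study above comes from signal detection theory: d' is the difference between the z-transformed hit rate and the z-transformed false-alarm rate. A minimal sketch (the rates below are made-up illustrations, not the study's data):

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """Sensitivity index: d' = Z(hit rate) - Z(false-alarm rate),
    where Z is the inverse of the standard normal CDF."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

# Made-up rates: 80% hits on 'angry' targets, 20% false alarms on other faces.
print(round(d_prime(0.80, 0.20), 3))  # → 1.683
```

In practice, designs like this clamp rates away from 0 and 1 (e.g., with a 1/(2N) correction) before the z-transform, since the inverse CDF is undefined at the extremes.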
Drapeau, Joanie; Gosselin, Nathalie; Peretz, Isabelle; McKerral, Michelle
2017-01-01
To assess emotion recognition from dynamic facial, vocal and musical expressions in sub-groups of adults with traumatic brain injuries (TBI) of different severities and identify possible common underlying mechanisms across domains. Forty-one adults participated in this study: 10 with moderate-severe TBI, 9 with complicated mild TBI, 11 with uncomplicated mild TBI and 11 healthy controls, who were administered experimental (emotional recognition, valence-arousal) and control tasks (emotional and structural discrimination) for each domain. Recognition of fearful faces was significantly impaired in the moderate-severe and complicated mild TBI sub-groups, as compared to those with uncomplicated mild TBI and controls. Effect sizes were medium-large. Participants with lower GCS scores performed more poorly when recognizing fearful dynamic facial expressions. Emotion recognition from auditory domains was preserved following TBI, irrespective of severity. All groups performed equally on control tasks, indicating no perceptual disorders. Although emotional recognition from vocal and musical expressions was preserved, no correlation was found across auditory domains. This preliminary study may contribute to improving our understanding of emotion recognition following TBI. Future studies of larger samples could usefully include measures of the functional impacts of recognition deficits for fearful facial expressions. These could help refine interventions for emotional recognition following a brain injury.
Van Rheenen, Tamsyn E; Joshua, Nicole; Castle, David J; Rossell, Susan L
2017-03-01
Emotion recognition impairments have been demonstrated in schizophrenia (Sz), but are less consistent and lesser in magnitude in bipolar disorder (BD). This may be related to the extent to which different face processing strategies are engaged during emotion recognition in each of these disorders. We recently showed that Sz patients had impairments in the use of both featural and configural face processing strategies, whereas BD patients were impaired only in the use of the latter. Here we examine the influence that these impairments have on facial emotion recognition in these cohorts. Twenty-eight individuals with Sz, 28 individuals with BD, and 28 healthy controls completed a facial emotion labeling task with two conditions designed to separate the use of featural and configural face processing strategies; part-based and whole-face emotion recognition. Sz patients performed worse than controls on both conditions, and worse than BD patients on the whole-face condition. BD patients performed worse than controls on the whole-face condition only. Configural processing deficits appear to influence the recognition of facial emotions in BD, whereas both configural and featural processing abnormalities impair emotion recognition in Sz. This may explain discrepancies in the profiles of emotion recognition between the disorders. (JINS, 2017, 23, 287-291).
Enhanced emotional empathy after psychosocial stress in young healthy men.
Wolf, Oliver T; Schulte, Judith M; Drimalla, Hanna; Hamacher-Dang, Tanja C; Knoch, Daria; Dziobek, Isabel
2015-01-01
Empathy is a core prerequisite for human social behavior. Relatively little is known about how empathy is influenced by social stress and its associated neuroendocrine alterations. The current study was designed to test the impact of acute stress on emotional and cognitive empathy. Healthy male participants were exposed to a psychosocial laboratory stressor (Trier Social Stress Test, TSST) or a well-matched control condition (Placebo-TSST). Afterwards they participated in an empathy test measuring emotional and cognitive empathy (Multifaceted Empathy Test, MET). Stress exposure caused an increase in negative affect, a rise in salivary alpha-amylase and a rise in cortisol. Participants exposed to stress reported more emotional empathy in response to pictures displaying both positive and negative emotional social scenes. Cognitive empathy (emotion recognition), in contrast, did not differ between the stress and control groups. The current findings provide initial evidence for enhanced emotional empathy after acute psychosocial stress.
Survey of Technologies for the Airport Border of the Future
2014-04-01
geometry Handwriting recognition ID cards Image classification Image enhancement Image fusion Image matching Image processing Image segmentation Iris...00 Tongue print Footstep recognition Odour recognition Retinal recognition Emotion recognition Periocular recognition Handwriting recognition Ear...recognition Palmprint recognition Hand geometry DNA matching Vein matching Ear recognition Handwriting recognition Periocular recognition Emotion
Bora, Emre; Velakoulis, Dennis; Walterfang, Mark
2016-07-01
Behavioral disturbances and lack of empathy are distinctive clinical features of behavioral variant frontotemporal dementia (bvFTD) in comparison to Alzheimer disease (AD). The aim of this meta-analytic review was to compare facial emotion recognition performances of bvFTD with healthy controls and AD. The current meta-analysis included a total of 19 studies and involved comparisons of 288 individuals with bvFTD and 329 healthy controls and 162 bvFTD and 147 patients with AD. Facial emotion recognition was significantly impaired in bvFTD in comparison to the healthy controls (d = 1.81) and AD (d = 1.23). In bvFTD, recognition of negative emotions, especially anger (d = 1.48) and disgust (d = 1.41), were severely impaired. Emotion recognition was significantly impaired in bvFTD in comparison to AD in all emotions other than happiness. Impairment of emotion recognition is a relatively specific feature of bvFTD. Routine assessment of social-cognitive abilities including emotion recognition can be helpful in better differentiating between cortical dementias such as bvFTD and AD. © The Author(s) 2016.
Food-Induced Emotional Resonance Improves Emotion Recognition.
Pandolfi, Elisa; Sacripante, Riccardo; Cardini, Flavia
2016-01-01
The effect of food substances on emotional states has been widely investigated, showing, for example, that eating chocolate is able to reduce negative mood. Here, for the first time, we have shown that the consumption of specific food substances is not only able to induce particular emotional states, but more importantly, to facilitate recognition of corresponding emotional facial expressions in others. Participants were asked to perform an emotion recognition task before and after eating either a piece of chocolate or a small amount of fish sauce-which we expected to induce happiness or disgust, respectively. Our results showed that being in a specific emotional state improves recognition of the corresponding emotional facial expression. Indeed, eating chocolate improved recognition of happy faces, while disgusted expressions were more readily recognized after eating fish sauce. In line with the embodied account of emotion understanding, we suggest that people are better at inferring the emotional state of others when their own emotional state resonates with the observed one.
Food-Induced Emotional Resonance Improves Emotion Recognition
Pandolfi, Elisa; Sacripante, Riccardo; Cardini, Flavia
2016-01-01
The effect of food substances on emotional states has been widely investigated, showing, for example, that eating chocolate is able to reduce negative mood. Here, for the first time, we have shown that the consumption of specific food substances is not only able to induce particular emotional states, but more importantly, to facilitate recognition of corresponding emotional facial expressions in others. Participants were asked to perform an emotion recognition task before and after eating either a piece of chocolate or a small amount of fish sauce—which we expected to induce happiness or disgust, respectively. Our results showed that being in a specific emotional state improves recognition of the corresponding emotional facial expression. Indeed, eating chocolate improved recognition of happy faces, while disgusted expressions were more readily recognized after eating fish sauce. In line with the embodied account of emotion understanding, we suggest that people are better at inferring the emotional state of others when their own emotional state resonates with the observed one. PMID:27973559
Lin, Chia-Yao; Tien, Yi-Min; Huang, Jong-Tsun; Tsai, Chon-Haw; Hsu, Li-Chuan
2016-01-01
Because of dopaminergic neurodegeneration, patients with Parkinson's disease (PD) show impairment in the recognition of negative facial expressions. In the present study, we aimed to determine whether PD patients with more advanced motor problems would show a much greater deficit in recognition of emotional facial expressions than a control group and whether impairment of emotion recognition would extend to positive emotions. Twenty-nine PD patients and 29 age-matched healthy controls were recruited. Participants were asked to discriminate emotions in Experiment 1 and identify gender in Experiment 2. In Experiment 1, PD patients demonstrated a recognition deficit for negative (sadness and anger) and positive faces. Further analysis showed that only PD patients with high motor dysfunction performed poorly in recognition of happy faces. In Experiment 2, PD patients showed an intact ability for gender identification, and the results eliminated possible abilities in the functions measured in Experiment 2 as alternative explanations for the results of Experiment 1. We concluded that patients' ability to recognize emotions deteriorated as the disease progressed. Recognition of negative emotions was impaired first, and then the impairment extended to positive emotions.
Tien, Yi-Min; Huang, Jong-Tsun
2016-01-01
Because of dopaminergic neurodegeneration, patients with Parkinson's disease (PD) show impairment in the recognition of negative facial expressions. In the present study, we aimed to determine whether PD patients with more advanced motor problems would show a much greater deficit in recognition of emotional facial expressions than a control group and whether impairment of emotion recognition would extend to positive emotions. Twenty-nine PD patients and 29 age-matched healthy controls were recruited. Participants were asked to discriminate emotions in Experiment 1 and identify gender in Experiment 2. In Experiment 1, PD patients demonstrated a recognition deficit for negative (sadness and anger) and positive faces. Further analysis showed that only PD patients with high motor dysfunction performed poorly in recognition of happy faces. In Experiment 2, PD patients showed an intact ability for gender identification, and the results eliminated possible abilities in the functions measured in Experiment 2 as alternative explanations for the results of Experiment 1. We concluded that patients' ability to recognize emotions deteriorated as the disease progressed. Recognition of negative emotions was impaired first, and then the impairment extended to positive emotions. PMID:27555668
Parents’ Emotion-Related Beliefs, Behaviors, and Skills Predict Children's Recognition of Emotion
Castro, Vanessa L.; Halberstadt, Amy G.; Lozada, Fantasy T.; Craig, Ashley B.
2015-01-01
Children who are able to recognize others’ emotions are successful in a variety of socioemotional domains, yet we know little about how school-aged children's abilities develop, particularly in the family context. We hypothesized that children develop emotion recognition skill as a function of parents’ own emotion-related beliefs, behaviors, and skills. We examined parents’ beliefs about the value of emotion and guidance of children's emotion, parents’ emotion labeling and teaching behaviors, and parents’ skill in recognizing children's emotions in relation to their school-aged children's emotion recognition skills. Sixty-nine parent-child dyads completed questionnaires, participated in dyadic laboratory tasks, and identified their own emotions and emotions felt by the other participant from videotaped segments. Regression analyses indicate that parents’ beliefs, behaviors, and skills together account for 37% of the variance in child emotion recognition ability, even after controlling for parent and child expressive clarity. The findings suggest the importance of the family milieu in the development of children's emotion recognition skill in middle childhood, and add to accumulating evidence suggesting important age-related shifts in the relation between parental emotion socialization and child emotional development. PMID:26005393
Actively paranoid patients with schizophrenia over attribute anger to neutral faces.
Pinkham, Amy E; Brensinger, Colleen; Kohler, Christian; Gur, Raquel E; Gur, Ruben C
2011-02-01
Previous investigations of the influence of paranoia on facial affect recognition in schizophrenia have been inconclusive as some studies demonstrate better performance for paranoid relative to non-paranoid patients and others show that paranoid patients display greater impairments. These studies have been limited by small sample sizes and inconsistencies in the criteria used to define groups. Here, we utilized an established emotion recognition task and a large sample to examine differential performance in emotion recognition ability between patients who were actively paranoid (AP) and those who were not actively paranoid (NAP). Accuracy and error patterns on the Penn Emotion Recognition test (ER40) were examined in 132 patients (64 NAP and 68 AP). Groups were defined based on the presence of paranoid ideation at the time of testing rather than diagnostic subtype. AP and NAP patients did not differ in overall task accuracy; however, an emotion by group interaction indicated that AP patients were significantly worse than NAP patients at correctly labeling neutral faces. A comparison of error patterns on neutral stimuli revealed that the groups differed only in misattributions of anger expressions, with AP patients being significantly more likely to misidentify a neutral expression as angry. The present findings suggest that paranoia is associated with a tendency to over attribute threat to ambiguous stimuli and also lend support to emerging hypotheses of amygdala hyperactivation as a potential neural mechanism for paranoid ideation. Copyright © 2010 Elsevier B.V. All rights reserved.
Sensory Contributions to Impaired Emotion Processing in Schizophrenia
Butler, Pamela D.; Abeles, Ilana Y.; Weiskopf, Nicole G.; Tambini, Arielle; Jalbrzikowski, Maria; Legatt, Michael E.; Zemon, Vance; Loughead, James; Gur, Ruben C.; Javitt, Daniel C.
2009-01-01
Both emotion and visual processing deficits are documented in schizophrenia, and preferential magnocellular visual pathway dysfunction has been reported in several studies. This study examined the contribution to emotion-processing deficits of magnocellular and parvocellular visual pathway function, based on stimulus properties and shape of contrast response functions. Experiment 1 examined the relationship between contrast sensitivity to magnocellular- and parvocellular-biased stimuli and emotion recognition using the Penn Emotion Recognition (ER-40) and Emotion Differentiation (EMODIFF) tests. Experiment 2 altered the contrast levels of the faces themselves to determine whether emotion detection curves would show a pattern characteristic of magnocellular neurons and whether patients would show a deficit in performance related to early sensory processing stages. Results for experiment 1 showed that patients had impaired emotion processing and a preferential magnocellular deficit on the contrast sensitivity task. Greater deficits in ER-40 and EMODIFF performance correlated with impaired contrast sensitivity to the magnocellular-biased condition, which remained significant for the EMODIFF task even when nonspecific correlations due to group were considered in a step-wise regression. Experiment 2 showed contrast response functions indicative of magnocellular processing for both groups, with patients showing impaired performance. Impaired emotion identification on this task was also correlated with magnocellular-biased visual sensory processing dysfunction. These results provide evidence for a contribution of impaired early-stage visual processing in emotion recognition deficits in schizophrenia and suggest that a bottom-up approach to remediation may be effective. PMID:19793797
Sensory contributions to impaired emotion processing in schizophrenia.
Butler, Pamela D; Abeles, Ilana Y; Weiskopf, Nicole G; Tambini, Arielle; Jalbrzikowski, Maria; Legatt, Michael E; Zemon, Vance; Loughead, James; Gur, Ruben C; Javitt, Daniel C
2009-11-01
Both emotion and visual processing deficits are documented in schizophrenia, and preferential magnocellular visual pathway dysfunction has been reported in several studies. This study examined the contribution to emotion-processing deficits of magnocellular and parvocellular visual pathway function, based on stimulus properties and shape of contrast response functions. Experiment 1 examined the relationship between contrast sensitivity to magnocellular- and parvocellular-biased stimuli and emotion recognition using the Penn Emotion Recognition (ER-40) and Emotion Differentiation (EMODIFF) tests. Experiment 2 altered the contrast levels of the faces themselves to determine whether emotion detection curves would show a pattern characteristic of magnocellular neurons and whether patients would show a deficit in performance related to early sensory processing stages. Results for experiment 1 showed that patients had impaired emotion processing and a preferential magnocellular deficit on the contrast sensitivity task. Greater deficits in ER-40 and EMODIFF performance correlated with impaired contrast sensitivity to the magnocellular-biased condition, which remained significant for the EMODIFF task even when nonspecific correlations due to group were considered in a step-wise regression. Experiment 2 showed contrast response functions indicative of magnocellular processing for both groups, with patients showing impaired performance. Impaired emotion identification on this task was also correlated with magnocellular-biased visual sensory processing dysfunction. These results provide evidence for a contribution of impaired early-stage visual processing in emotion recognition deficits in schizophrenia and suggest that a bottom-up approach to remediation may be effective.
Facial emotion recognition and borderline personality pathology.
Meehan, Kevin B; De Panfilis, Chiara; Cain, Nicole M; Antonucci, Camilla; Soliani, Antonio; Clarkin, John F; Sambataro, Fabio
2017-09-01
The impact of borderline personality pathology on facial emotion recognition has been in dispute; with impaired, comparable, and enhanced accuracy found in high borderline personality groups. Discrepancies are likely driven by variations in facial emotion recognition tasks across studies (stimuli type/intensity) and heterogeneity in borderline personality pathology. This study evaluates facial emotion recognition for neutral and negative emotions (fear/sadness/disgust/anger) presented at varying intensities. Effortful control was evaluated as a moderator of facial emotion recognition in borderline personality. Non-clinical multicultural undergraduates (n = 132) completed a morphed facial emotion recognition task of neutral and negative emotional expressions across different intensities (100% Neutral; 25%/50%/75% Emotion) and self-reported borderline personality features and effortful control. Greater borderline personality features related to decreased accuracy in detecting neutral faces, but increased accuracy in detecting negative emotion faces, particularly at low-intensity thresholds. This pattern was moderated by effortful control; for individuals with low but not high effortful control, greater borderline personality features related to misattributions of emotion to neutral expressions, and enhanced detection of low-intensity emotional expressions. Individuals with high borderline personality features may therefore exhibit a bias toward detecting negative emotions that are not or barely present; however, good self-regulatory skills may protect against this potential social-cognitive vulnerability. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
Social Approach and Emotion Recognition in Fragile X Syndrome
ERIC Educational Resources Information Center
Williams, Tracey A.; Porter, Melanie A.; Langdon, Robyn
2014-01-01
Evidence is emerging that individuals with Fragile X syndrome (FXS) display emotion recognition deficits, which may contribute to their significant social difficulties. The current study investigated the emotion recognition abilities, and social approachability judgments, of FXS individuals when processing emotional stimuli. Relative to…
Impact of civil war on emotion recognition: the denial of sadness in Sierra Leone.
Umiltà, Maria Allessandra; Wood, Rachel; Loffredo, Francesca; Ravera, Roberto; Gallese, Vittorio
2013-01-01
Studies of children with atypical emotional experience demonstrate that childhood exposure to high levels of hostility and threat biases emotion perception. This study investigates emotion processing, in former child soldiers and non-combatant civilians. All participants have experienced prolonged violence exposure during childhood. The study, carried out in Sierra Leone, aimed to examine the effects of exposure to and forced participation in acts of extreme violence on the emotion processing of young adults war survivors. A total of 76 young, male adults (38 former child soldier survivors and 38 civilian survivors) were tested in order to assess participants' ability to identify four different facial emotion expressions from photographs and movies. Both groups were able to recognize facial expressions of emotion. However, despite their general ability to correctly identify facial emotions, participants showed a significant response bias in their recognition of sadness. Both former soldiers and civilians made more errors in identifying expressions of sadness than in the other three emotions and when mislabeling sadness participants most often described it as anger. Conversely, when making erroneous identifications of other emotions, participants were most likely to label the expressed emotion as sadness. In addition, while for three of the four emotions participants were better able to make a correct identification the greater the intensity of the expression, this pattern was not observed for sadness. During movies presentation the recognition of sadness was significantly worse for soldiers. While both former child soldiers and civilians were found to be able to identify facial emotions, a significant response bias in their attribution of negative emotions was observed. Such bias was particularly pronounced in former child soldiers. These findings point to a pervasive long-lasting effect of childhood exposure to violence on emotion processing in later life.
Impact of civil war on emotion recognition: the denial of sadness in Sierra Leone
Umiltà, Maria Allessandra; Wood, Rachel; Loffredo, Francesca; Ravera, Roberto; Gallese, Vittorio
2013-01-01
Studies of children with atypical emotional experience demonstrate that childhood exposure to high levels of hostility and threat biases emotion perception. This study investigates emotion processing, in former child soldiers and non-combatant civilians. All participants have experienced prolonged violence exposure during childhood. The study, carried out in Sierra Leone, aimed to examine the effects of exposure to and forced participation in acts of extreme violence on the emotion processing of young adults war survivors. A total of 76 young, male adults (38 former child soldier survivors and 38 civilian survivors) were tested in order to assess participants' ability to identify four different facial emotion expressions from photographs and movies. Both groups were able to recognize facial expressions of emotion. However, despite their general ability to correctly identify facial emotions, participants showed a significant response bias in their recognition of sadness. Both former soldiers and civilians made more errors in identifying expressions of sadness than in the other three emotions and when mislabeling sadness participants most often described it as anger. Conversely, when making erroneous identifications of other emotions, participants were most likely to label the expressed emotion as sadness. In addition, while for three of the four emotions participants were better able to make a correct identification the greater the intensity of the expression, this pattern was not observed for sadness. During movies presentation the recognition of sadness was significantly worse for soldiers. While both former child soldiers and civilians were found to be able to identify facial emotions, a significant response bias in their attribution of negative emotions was observed. Such bias was particularly pronounced in former child soldiers. These findings point to a pervasive long-lasting effect of childhood exposure to violence on emotion processing in later life. 
PMID:24027541
Wingenbach, Tanja S H; Ashwin, Chris; Brosnan, Mark
2018-01-01
There has been much research on sex differences in the ability to recognise facial expressions of emotions, with results generally showing a female advantage in reading emotional expressions from the face. However, most of the research to date has used static images and/or 'extreme' examples of facial expressions. Therefore, little is known about how expression intensity and dynamic stimuli might affect the commonly reported female advantage in facial emotion recognition. The current study investigated sex differences in accuracy of response (Hu; unbiased hit rates) and response latencies for emotion recognition using short video stimuli (1sec) of 10 different facial emotion expressions (anger, disgust, fear, sadness, surprise, happiness, contempt, pride, embarrassment, neutral) across three variations in the intensity of the emotional expression (low, intermediate, high) in an adolescent and adult sample (N = 111; 51 male, 60 female) aged between 16 and 45 (M = 22.2, SD = 5.7). Overall, females showed more accurate facial emotion recognition compared to males and were faster in correctly recognising facial emotions. The female advantage in reading expressions from the faces of others was unaffected by expression intensity levels and emotion categories used in the study. The effects were specific to recognition of emotions, as males and females did not differ in the recognition of neutral faces. Together, the results showed a robust sex difference favouring females in facial emotion recognition using video stimuli of a wide range of emotions and expression intensity variations.
Sex differences in facial emotion recognition across varying expression intensity levels from videos
2018-01-01
There has been much research on sex differences in the ability to recognise facial expressions of emotions, with results generally showing a female advantage in reading emotional expressions from the face. However, most of the research to date has used static images and/or ‘extreme’ examples of facial expressions. Therefore, little is known about how expression intensity and dynamic stimuli might affect the commonly reported female advantage in facial emotion recognition. The current study investigated sex differences in accuracy of response (Hu; unbiased hit rates) and response latencies for emotion recognition using short video stimuli (1sec) of 10 different facial emotion expressions (anger, disgust, fear, sadness, surprise, happiness, contempt, pride, embarrassment, neutral) across three variations in the intensity of the emotional expression (low, intermediate, high) in an adolescent and adult sample (N = 111; 51 male, 60 female) aged between 16 and 45 (M = 22.2, SD = 5.7). Overall, females showed more accurate facial emotion recognition compared to males and were faster in correctly recognising facial emotions. The female advantage in reading expressions from the faces of others was unaffected by expression intensity levels and emotion categories used in the study. The effects were specific to recognition of emotions, as males and females did not differ in the recognition of neutral faces. Together, the results showed a robust sex difference favouring females in facial emotion recognition using video stimuli of a wide range of emotions and expression intensity variations. PMID:29293674
More Pronounced Deficits in Facial Emotion Recognition for Schizophrenia than Bipolar Disorder
Goghari, Vina M; Sponheim, Scott R
2012-01-01
Schizophrenia and bipolar disorder are typically separated in diagnostic systems. Behavioural, cognitive, and brain abnormalities associated with each disorder nonetheless overlap. We evaluated the diagnostic specificity of facial emotion recognition deficits in schizophrenia and bipolar disorder to determine whether select aspects of emotion recognition differed for the two disorders. The investigation used an experimental task that included the same facial images in an emotion recognition condition and an age recognition condition (to control for processes associated with general face recognition) in 27 schizophrenia patients, 16 bipolar I patients, and 30 controls. Schizophrenia and bipolar patients exhibited both shared and distinct aspects of facial emotion recognition deficits. Schizophrenia patients had deficits in recognizing angry facial expressions compared to healthy controls and bipolar patients. Compared to control participants, both schizophrenia and bipolar patients were more likely to mislabel facial expressions of anger as fear. Given that schizophrenia patients exhibited a deficit in emotion recognition for angry faces, which did not appear due to generalized perceptual and cognitive dysfunction, improving recognition of threat-related expression may be an important intervention target to improve social functioning in schizophrenia. PMID:23218816
Öztürk, Ahmet; Kiliç, Alperen; Deveci, Erdem; Kirpinar, İsmet
2016-01-01
Background The concept of facial emotion recognition is well established in various neuropsychiatric disorders. Although emotional disturbances are strongly associated with somatoform disorders, there are a restricted number of studies that have investigated facial emotion recognition in somatoform disorders. Furthermore, there have been no studies that have regarded this issue using the new diagnostic criteria for somatoform disorders as somatic symptoms and related disorders (SSD). In this study, we aimed to compare the factors of facial emotion recognition between patients with SSD and age- and sex-matched healthy controls (HC) and to retest and investigate the factors of facial emotion recognition using the new criteria for SSD. Patients and methods After applying the inclusion and exclusion criteria, 54 patients who were diagnosed with SSD according to the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) criteria and 46 age- and sex-matched HC were selected to participate in the present study. Facial emotion recognition, alexithymia, and the status of anxiety and depression were compared between the groups. Results Patients with SSD had significantly decreased scores of facial emotion for fear faces, disgust faces, and neutral faces compared with age- and sex-matched HC (t=−2.88, P=0.005; t=−2.86, P=0.005; and t=−2.56, P=0.009, respectively). After eliminating the effects of alexithymia and depressive and anxious states, the groups were found to be similar in terms of their responses to facial emotion and mean reaction time to facial emotions. Discussion Although there have been limited numbers of studies that have examined the recognition of facial emotion in patients with somatoform disorders, our study is the first to investigate facial recognition in patients with SSD diagnosed according to the DSM-5 criteria. Recognition of facial emotion was found to be disturbed in patients with SSD. 
However, our findings suggest that disturbances in facial recognition were significantly associated with alexithymia and the status of depression and anxiety, which is consistent with the previous studies. Further studies are needed to highlight the associations between facial emotion recognition and SSD. PMID:27199559
Doi, Hirokazu; Fujisawa, Takashi X; Kanai, Chieko; Ohta, Haruhisa; Yokoi, Hideki; Iwanami, Akira; Kato, Nobumasa; Shinohara, Kazuyuki
2013-09-01
This study investigated the ability of adults with Asperger syndrome to recognize emotional categories of facial expressions and emotional prosodies with graded emotional intensities. The individuals with Asperger syndrome showed poorer recognition performance for angry and sad expressions from both facial and vocal information. The group difference in facial expression recognition was prominent for stimuli with low or intermediate emotional intensities. In contrast to this, the individuals with Asperger syndrome exhibited lower recognition accuracy than typically-developed controls mainly for emotional prosody with high emotional intensity. In facial expression recognition, Asperger and control groups showed an inversion effect for all categories. The magnitude of this effect was less in the Asperger group for angry and sad expressions, presumably attributable to reduced recruitment of the configural mode of face processing. The individuals with Asperger syndrome outperformed the control participants in recognizing inverted sad expressions, indicating enhanced processing of local facial information representing sad emotion. These results suggest that the adults with Asperger syndrome rely on modality-specific strategies in emotion recognition from facial expression and prosodic information.
Family environment influences emotion recognition following paediatric traumatic brain injury.
Schmidt, Adam T; Orsten, Kimberley D; Hanten, Gerri R; Li, Xiaoqi; Levin, Harvey S
2010-01-01
This study investigated the relationship between family functioning and performance on two tasks of emotion recognition (emotional prosody and face emotion recognition) and a cognitive control procedure (the Flanker task) following paediatric traumatic brain injury (TBI) or orthopaedic injury (OI). A total of 142 children (75 TBI, 67 OI) were assessed on three occasions: baseline, 3 months and 1 year post-injury on the two emotion recognition tasks and the Flanker task. Caregivers also completed the Life Stressors and Resources Scale (LISRES) on each occasion. Growth curve analysis was used to analyse the data. Results indicated that family functioning influenced performance on the emotional prosody and Flanker tasks but not on the face emotion recognition task. Findings on both the emotional prosody and Flanker tasks were generally similar across groups. However, financial resources emerged as significantly related to emotional prosody performance in the TBI group only (p = 0.0123). Findings suggest family functioning variables--especially financial resources--can influence performance on an emotional processing task following TBI in children.
Felisberti, Fatima; Terry, Philip
2015-09-01
The study compared alcohol's effects on the recognition of briefly displayed facial expressions of emotion (so-called microexpressions) with expressions presented for a longer period. Using a repeated-measures design, we tested 18 participants three times (counterbalanced), after (i) a placebo drink, (ii) a low-to-moderate dose of alcohol (0.17 g/kg women; 0.20 g/kg men) and (iii) a moderate-to-high dose of alcohol (0.52 g/kg women; 0.60 g/kg men). On each session, participants were presented with stimuli representing six emotions (happiness, sadness, anger, fear, disgust and contempt) overlaid on a generic avatar in a six-alternative forced-choice paradigm. A neutral expression (1 s) preceded and followed a target expression presented for 200 ms (microexpressions) or 400 ms. Participants mouse clicked the correct answer. The recognition of disgust was significantly better after the high dose of alcohol than after the low dose or placebo drinks at both durations of stimulus presentation. A similar profile of effects was found for the recognition of contempt. There were no effects on response latencies. Alcohol can increase sensitivity to expressions of disgust and contempt. Such effects are not dependent on stimulus duration up to 400 ms and may reflect contextual modulation of alcohol's effects on emotion recognition. Copyright © 2015 John Wiley & Sons, Ltd.
Subthreshold social cognitive deficits may be a key to distinguish 22q11.2DS from schizophrenia.
Peyroux, Elodie; Rigard, Caroline; Saucourt, Guillaume; Poisson, Alice; Plasse, Julien; Franck, Nicolas; Demily, Caroline
2018-03-25
Social cognitive impairments are core features in 22q11.2 deletion syndrome (22q11.2DS) and schizophrenia (SCZ). Indeed, adults with 22q11.2DS often have poorer social competence as well as poorer performance on measures of social cognitive skills (emotion recognition and theory of mind, ToM) compared with typically developing people. However, studies comparing specific social cognitive components in 22q11.2DS and SCZ have not yet been widely conducted. In this study we compared the performance of patients with 22q11.2DS and SCZ on both facial emotion recognition and ToM. Patients with 22q11.2DS (n = 18) and matched SCZ patients were recruited. After neuropsychological testing, the facial emotion recognition test assessed the patients' ability to recognize six basic, universal emotions (joy, anger, sadness, fear, disgust, and contempt). The Versailles-situational intentional reading evaluated ToM with six scenes from movies showing characters in complex interactions (involving hints, lies, and indirect speech). We show that patients with 22q11.2DS exhibited significantly lower performance in emotion recognition than SCZ patients did, especially for disgust, contempt, and fear. This impairment seems to be a core cognitive phenotype in 22q11.2DS, regardless of the presence of SCZ symptoms. Concerning ToM, our results may highlight the same level of impairment in 22q11.2DS and SCZ but need to be replicated in a larger cohort. Our results document the existence of subthreshold social cognitive deficits distinguishing 22q11.2DS from SCZ.
Effects of Oxytocin on Neural Response to Facial Expressions in Patients with Schizophrenia
Shin, Na Young; Park, Hye Yoon; Jung, Wi Hoon; Park, Jin Woo; Yun, Je-Yeon; Jang, Joon Hwan; Kim, Sung Nyun; Han, Hyun Jung; Kim, So-Yeon; Kang, Do-Hyung; Kwon, Jun Soo
2015-01-01
Impaired facial emotion recognition is a core deficit in schizophrenia. Oxytocin has been shown to improve social perception in patients with schizophrenia; however, the effect of oxytocin on the neural activity underlying facial emotion recognition has not been investigated. This study aimed to assess the effect of a single dose of intranasal oxytocin on brain activity in patients with schizophrenia using an implicit facial emotion-recognition paradigm. Sixteen male patients with schizophrenia and 16 age-matched healthy male control subjects participated in a randomized, double-blind, placebo-controlled crossover trial at Seoul National University Hospital. Delivery of a single dose of 40 IU intranasal oxytocin and the placebo was separated by 1 week. Drug conditions were compared by performing a region of interest (ROI) analysis of the bilateral amygdala on responses to the emotion recognition test. The oxytocin nasal spray decreased amygdala activity for fearful faces and increased activity for happy faces. Further, oxytocin elicited differential effects between the patient and control groups: intranasal oxytocin attenuated amygdala activity for emotional faces in patients with schizophrenia, whereas it significantly increased amygdala activity in healthy controls. Oxytocin-induced BOLD signal changes in the amygdala in response to happy faces were related to attachment style in the control group. Our results provide new evidence of a modulatory effect of oxytocin on neural response to emotional faces in patients with schizophrenia. Future studies are needed to investigate the effectiveness of long-term treatment with intranasal oxytocin on neural activity in patients with schizophrenia. PMID:25666311
Buratto, Luciano G.; Pottage, Claire L.; Brown, Charity; Morrison, Catriona M.; Schaefer, Alexandre
2014-01-01
Memory performance is usually impaired when participants have to encode information while performing a concurrent task. Recent studies using recall tasks have found that emotional items are more resistant to such cognitive depletion effects than non-emotional items. However, when recognition tasks are used, the same effect is more elusive as recent recognition studies have obtained contradictory results. In two experiments, we provide evidence that negative emotional content can reliably reduce the effects of cognitive depletion on recognition memory only if stimuli with high levels of emotional intensity are used. In particular, we found that recognition performance for realistic pictures was impaired by a secondary 3-back working memory task during encoding if stimuli were emotionally neutral or had moderate levels of negative emotionality. In contrast, when negative pictures with high levels of emotional intensity were used, the detrimental effects of the secondary task were significantly attenuated. PMID:25330251
Effectiveness of Emotion Recognition Training for Young Children with Developmental Delays
ERIC Educational Resources Information Center
Downs, Andrew; Strand, Paul
2008-01-01
Emotion recognition is a basic skill that is thought to facilitate development of social and emotional competence. There is little research available examining whether therapeutic or instructional interventions can improve the emotion recognition skill of young children with various developmental disabilities. Sixteen preschool children with…
Body Emotion Recognition Disproportionately Depends on Vertical Orientations during Childhood
ERIC Educational Resources Information Center
Balas, Benjamin; Auen, Amanda; Saville, Alyson; Schmidt, Jamie
2018-01-01
Children's ability to recognize emotional expressions from faces and bodies develops during childhood. However, the low-level features that support accurate body emotion recognition during development have not been well characterized. This is in marked contrast to facial emotion recognition, which is known to depend upon specific spatial frequency…
Hooker, Christine I; Bruce, Lori; Fisher, Melissa; Verosky, Sara C; Miyakawa, Asako; Vinogradov, Sophia
2012-08-01
Cognitive remediation training has been shown to improve both cognitive and social cognitive deficits in people with schizophrenia, but the mechanisms that support this behavioral improvement are largely unknown. One hypothesis is that intensive behavioral training in cognition and/or social cognition restores the underlying neural mechanisms that support targeted skills. However, there is little research on the neural effects of cognitive remediation training. This study investigated whether a 50 h (10-week) remediation intervention which included both cognitive and social cognitive training would influence neural function in regions that support social cognition. Twenty-two stable, outpatient schizophrenia participants were randomized to a treatment condition consisting of auditory-based cognitive training (AT) [Brain Fitness Program/auditory module ~60 min/day] plus social cognition training (SCT) which was focused on emotion recognition [~5-15 min per day] or a placebo condition of non-specific computer games (CG) for an equal amount of time. Pre and post intervention assessments included an fMRI task of positive and negative facial emotion recognition, and standard behavioral assessments of cognition, emotion processing, and functional outcome. There were no significant intervention-related improvements in general cognition or functional outcome. fMRI results showed the predicted group-by-time interaction. Specifically, in comparison to CG, AT+SCT participants had a greater pre-to-post intervention increase in postcentral gyrus activity during emotion recognition of both positive and negative emotions. Furthermore, among all participants, the increase in postcentral gyrus activity predicted behavioral improvement on a standardized test of emotion processing (MSCEIT: Perceiving Emotions). Results indicate that combined cognition and social cognition training impacts neural mechanisms that support social cognition skills.
Facial and prosodic emotion recognition in social anxiety disorder.
Tseng, Huai-Hsuan; Huang, Yu-Lien; Chen, Jian-Ting; Liang, Kuei-Yu; Lin, Chao-Cheng; Chen, Sue-Huei
2017-07-01
Patients with social anxiety disorder (SAD) have a cognitive preference to negatively evaluate emotional information. In particular, the preferential biases in prosodic emotion recognition in SAD have been much less explored. The present study aims to investigate whether SAD patients retain negative evaluation biases across visual and auditory modalities when given sufficient response time to recognise emotions. Thirty-one SAD patients and 31 age- and gender-matched healthy participants completed a culturally suitable non-verbal emotion recognition task and received clinical assessments for social anxiety and depressive symptoms. A repeated measures analysis of variance was conducted to examine group differences in emotion recognition. Compared to healthy participants, SAD patients were significantly less accurate at recognising facial and prosodic emotions, and spent more time on emotion recognition. The differences were mainly driven by the lower accuracy and longer reaction times for recognising fearful emotions in SAD patients. Within the SAD patients, lower accuracy of sad face recognition was associated with higher severity of depressive and social anxiety symptoms, particularly with avoidance symptoms. These findings may represent a cross-modality pattern of avoidance in the later stage of identifying negative emotions in SAD. This pattern may be linked to clinical symptom severity.
Marchewka, Artur; Wypych, Marek; Michałowski, Jarosław M.; Sińczuk, Marcin; Wordecha, Małgorzata; Jednoróg, Katarzyna; Nowicka, Anna
2016-01-01
Studies demonstrating the memory-facilitating effect of emotions have typically focused on the affective dimensions of arousal and valence. Little is known, however, about the extent to which stimulus-driven basic emotions could have distinct effects on memory. In the present paper we sought to examine the modulatory effect of disgust, fear, and sadness on intentional remembering and forgetting using the widely used item-method directed forgetting (DF) paradigm. Eighteen women underwent fMRI scanning during the encoding phase, in which they were asked either to remember (R) or to forget (F) pictures. In the test phase all previously used stimuli were re-presented together with the same number of new pictures, and participants had to categorize them as old or new, irrespective of the F/R instruction. On the behavioral level we found a typical DF effect, i.e., higher recognition rates for to-be-remembered (TBR) items than for to-be-forgotten (TBF) ones for both neutral and emotional categories. Emotional stimuli had a higher recognition rate than neutral ones, and among emotional stimuli those eliciting disgust produced the highest recognition, but at the same time induced more false alarms. Therefore, when false-alarm-corrected recognition was examined, the DF effect was equally strong irrespective of emotion. Additionally, even though subjects rated disgusting pictures as more arousing and negative than other picture categories, logistic regression at the item level showed that the effect of disgust on recognition memory was stronger than the effect of arousal or valence. On the neural level, ROI analyses (with valence and arousal covariates) revealed that correctly recognized disgusting stimuli evoked the highest activity in the left amygdala compared to all other categories. This structure was also more activated for remembered vs. forgotten stimuli, but only for pictures eliciting disgust or fear. Our findings, despite several limitations, suggest that disgust has a special salience in memory relative to other negative emotions, which cannot be put down to differences in arousal or valence. The current results thereby support the suggestion that a purely dimensional model of emotional influences on cognition might not be adequate to account for the observed effects. PMID:27551262
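The "false alarm corrected recognition" mentioned above is commonly computed as the two-high-threshold discrimination index Pr = hit rate minus false-alarm rate; the sketch below assumes that convention, and the trial counts are invented for illustration, not the study's data.

```python
# Minimal sketch of false-alarm-corrected recognition, assuming the common
# discrimination index Pr = hit rate - false-alarm rate. All counts are
# hypothetical; they only illustrate how a raw recognition advantage can
# shrink once false alarms are taken into account.

def rates(hits: int, misses: int, false_alarms: int, correct_rejections: int):
    """Hit rate over old items and false-alarm rate over new items."""
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    return hit_rate, fa_rate


def corrected_recognition(hits, misses, false_alarms, correct_rejections):
    """Pr: hit rate minus false-alarm rate."""
    hr, far = rates(hits, misses, false_alarms, correct_rejections)
    return hr - far


# Disgust items: high raw recognition but also more false alarms.
disgust = corrected_recognition(45, 5, 15, 35)   # 0.90 - 0.30 = 0.60
# Neutral items: lower raw recognition but few false alarms.
neutral = corrected_recognition(38, 12, 6, 44)   # 0.76 - 0.12 = 0.64
```

This is why the abstract's raw recognition advantage for disgust pictures can coexist with an equally strong DF effect once the correction is applied: the extra hits are partly offset by the extra false alarms.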
Chu, Simon; McNeill, Kimberley; Ireland, Jane L; Qurashi, Inti
2015-12-15
We investigated the relationship between a change in sleep quality and facial emotion recognition accuracy in a group of mentally-disordered inpatients at a secure forensic psychiatric unit. Patients whose sleep improved over time also showed improved facial emotion recognition, while patients who showed no sleep improvement showed no change in emotion recognition.
Face Age and Eye Gaze Influence Older Adults' Emotion Recognition.
Campbell, Anna; Murray, Janice E; Atkinson, Lianne; Ruffman, Ted
2017-07-01
Eye gaze has been shown to influence emotion recognition. In addition, older adults (over 65 years) are not as influenced by gaze direction cues as young adults (18-30 years). Nevertheless, these differences might stem from the use of young to middle-aged faces in emotion recognition research because older adults have an attention bias toward old-age faces. Therefore, using older face stimuli might allow older adults to process gaze direction cues to influence emotion recognition. To investigate this idea, young and older adults completed an emotion recognition task with young and older face stimuli displaying direct and averted gaze, assessing labeling accuracy for angry, disgusted, fearful, happy, and sad faces. Direct gaze rather than averted gaze improved young adults' recognition of emotions in young and older faces, but for older adults this was true only for older faces. The current study highlights the impact of stimulus face age and gaze direction on emotion recognition in young and older adults. The use of young face stimuli with direct gaze in most research might contribute to age-related emotion recognition differences.
Roddy, S; Tiedt, L; Kelleher, I; Clarke, M C; Murphy, J; Rawdon, C; Roche, R A P; Calkins, M E; Richard, J A; Kohler, C G; Cannon, M
2012-10-01
Psychotic symptoms, also termed psychotic-like experiences (PLEs) in the absence of psychotic disorder, are common in adolescents and are associated with increased risk of schizophrenia-spectrum illness in adulthood. At the same time, schizophrenia is associated with deficits in social cognition, with deficits particularly documented in facial emotion recognition (FER). However, little is known about the relationship between PLEs and FER abilities, with only one previous prospective study examining the association between these abilities in childhood and reported PLEs in adolescence. The current study was a cross-sectional investigation of the association between PLEs and FER in a sample of Irish adolescents. The Adolescent Psychotic-Like Symptom Screener (APSS), a self-report measure of PLEs, and the Penn Emotion Recognition-40 Test (Penn ER-40), a measure of facial emotion recognition, were completed by 793 children aged 10-13 years. Children who reported PLEs performed significantly more poorly on FER (β=-0.03, p=0.035). Recognition of sad faces was the major driver of effects, with children performing particularly poorly when identifying this expression (β=-0.08, p=0.032). The current findings show that PLEs are associated with poorer FER. Further work is needed to elucidate causal relationships with implications for the design of future interventions for those at risk of developing psychosis.
Visser-Keizer, Annemarie C.; Westerhof-Evers, Herma J.; Gerritsen, Marleen J. J.; van der Naalt, Joukje; Spikman, Jacoba M.
2016-01-01
Fear is an important emotional reaction that guides decision making in situations of ambiguity or uncertainty. Both recognition of facial expressions of fear and decision making ability can be impaired after traumatic brain injury (TBI), in particular when the frontal lobe is damaged. So far, it has not been investigated how recognition of fear influences risk behavior in healthy subjects and TBI patients. The ability to recognize fear is thought to be related to the ability to experience fear and to use it as a warning signal to guide decision making. We hypothesized that a better ability to recognize fear would be related to a better regulation of risk behavior, with healthy controls outperforming TBI patients. To investigate this, 59 healthy subjects and 49 TBI patients were assessed with a test for emotion recognition (Facial Expression of Emotion: Stimuli and Tests) and a gambling task (Iowa Gambling Task (IGT)). The results showed that, regardless of post traumatic amnesia duration or the presence of frontal lesions, patients were more impaired than healthy controls on both fear recognition and decision making. In both groups, a significant relationship was found between better fear recognition, the development of an advantageous strategy across the IGT and less risk behavior in the last blocks of the IGT. Educational level moderated this relationship in the final block of the IGT. This study has important clinical implications, indicating that impaired decision making and risk behavior after TBI can be preceded by deficits in the processing of fear. PMID:27870900
Connolly, Hannah L; Lefevre, Carmen E; Young, Andrew W; Lewis, Gary J
2018-05-21
Although it is widely believed that females outperform males in the ability to recognize other people's emotions, this conclusion is not well supported by the extant literature. The current study sought to provide a strong test of the female superiority hypothesis by investigating sex differences in emotion recognition for five basic emotions using stimuli well-calibrated for individual differences assessment, across two expressive domains (face and body), and in a large sample (N = 1,022: Study 1). We also assessed the stability and generalizability of our findings with two independent replication samples (N = 303: Study 2, N = 634: Study 3). In Study 1, we observed that females were superior to males in recognizing facial disgust and sadness. In contrast, males were superior to females in recognizing bodily happiness. The female superiority for recognition of facial disgust was replicated in Studies 2 and 3, and this observation also extended to an independent stimulus set in Study 2. No other sex differences were stable across studies. These findings provide evidence for the presence of sex differences in emotion recognition ability, but show that these differences are modest in magnitude and appear to be limited to facial disgust. We discuss whether this sex difference may reflect human evolutionary imperatives concerning reproductive fitness and child care.
Major depressive disorder skews the recognition of emotional prosody.
Péron, Julie; El Tamer, Sarah; Grandjean, Didier; Leray, Emmanuelle; Travers, David; Drapier, Dominique; Vérin, Marc; Millet, Bruno
2011-06-01
Major depressive disorder (MDD) is associated with abnormalities in the recognition of emotional stimuli. MDD patients ascribe more negative emotion but also less positive emotion to facial expressions, suggesting blunted responsiveness to positive emotional stimuli. To ascertain whether these emotional biases are modality-specific, we examined the effects of MDD on the recognition of emotions from voices using a paradigm designed to capture subtle effects of biases. Twenty-one MDD patients and 21 healthy controls (HC) underwent clinical and neuropsychological assessments, followed by a paradigm featuring pseudowords spoken by actors in five types of emotional prosody, rated on continuous scales. Overall, MDD patients performed more poorly than HC, displaying significantly impaired recognition of fear, happiness and sadness. Compared with HC, they rated fear significantly more highly when listening to anger stimuli. They also displayed a bias toward surprise, rating it far higher when they heard sad or fearful utterances. Furthermore, for happiness stimuli, MDD patients gave higher ratings for negative emotions (fear and sadness). A multiple regression model on recognition of emotional prosody in MDD patients showed that the best fit was achieved using executive functioning measures (categorical fluency, number of errors on the MCST, and TMT B-A) and the total score of the Montgomery-Asberg Depression Rating Scale. Impaired recognition of emotions would thus appear not to be specific to the visual modality but to be present also when emotions are expressed vocally, with this impairment being related to depression severity and dysexecutive syndrome. MDD seems to skew the recognition of emotional prosody toward negative emotional stimuli, and the blunting of positive emotion appears not to be restricted to the visual modality.
Anomalous subjective experience and psychosis risk in young depressed patients.
Szily, Erika; Kéri, Szabolcs
2009-01-01
Help-seeking young people often display depressive symptoms. In some patients, these symptoms may co-exist with clinically high-risk mental states for psychosis. The aim of this study was to determine differences in subjective experience and social perception in young depressed patients with and without psychosis risk. Participants were 68 young persons with major depressive disorder. Twenty-six patients also met the criteria of attenuated or brief limited intermittent psychotic symptoms according to the Comprehensive Assessment of At Risk Mental States (CAARMS) criteria. Subjective experiences were assessed with the Bonn Scale for the Assessment of Basic Symptoms (BSABS). Recognition of complex social emotions and mental states was assessed using the 'Reading the Mind in the Eyes' test. Perplexity, self-disorder, and diminished affectivity significantly predicted psychosis risk. Depressed patients without psychosis risk displayed impaired recognition performance for negative social emotions, whereas patients with psychosis risk were also impaired in the recognition of cognitive expressions. In the high-risk group, self-disorder was associated with impaired recognition of facial expressions. These results suggest that anomalous subjective experience and impaired recognition of complex emotions may differentiate between young depressed patients with and without psychosis risk.
Gold, Rinat; Butler, Pamela; Revheim, Nadine; Leitman, David; Hansen, John A.; Gur, Ruben; Kantrowitz, Joshua T.; Laukka, Petri; Juslin, Patrik N.; Silipo, Gail S.; Javitt, Daniel C.
2013-01-01
Objective: Schizophrenia is associated with deficits in the ability to perceive emotion based on tone of voice. The basis for this deficit, however, remains unclear, and assessment batteries remain limited. We evaluated performance in schizophrenia on a novel voice emotion recognition battery with well-characterized physical features, relative to impairments in more general emotional and cognitive function. Methods: We studied a primary sample of 92 patients and 73 controls. Stimuli were characterized according to both intended emotion and the physical features (e.g., pitch, intensity) that contributed to the emotional percept. Parallel measures of visual emotion recognition, pitch perception, general cognition, and overall outcome were obtained. More limited measures were obtained in an independent replication sample of 36 patients, 31 age-matched controls, and 188 general comparison subjects. Results: Patients showed significant, large-effect-size deficits in voice emotion recognition (F=25.4, p<.00001, d=1.1) and were preferentially impaired in recognition of emotion based on pitch features, but not intensity features (group × feature interaction: F=7.79, p=.006). Emotion recognition deficits were significantly correlated with pitch perception impairments both across (r=.56, p<.0001) and within (r=.47, p<.0001) groups. Path analysis showed both sensory-specific and general cognitive contributions to auditory emotion recognition deficits in schizophrenia. Similar patterns of results were observed in the replication sample. Conclusions: The present study demonstrates impairments in auditory emotion recognition in schizophrenia relative to the acoustic features of the underlying stimuli. Furthermore, it provides tools for, and highlights the need for greater attention to, the physical features of stimuli used in studies of social cognition in neuropsychiatric disorders. PMID:22362394
Parra, Mario A; Pattan, Vivek; Wong, Dichelle; Beaglehole, Anna; Lonie, Jane; Wan, Hong I; Honey, Garry; Hall, Jeremy; Whalley, Heather C; Lawrie, Stephen M
2013-03-06
Relative to intentional memory encoding, which quickly declines in Mild Cognitive Impairment (MCI) and Alzheimer's disease (AD), incidental memory for emotional stimuli appears to deteriorate more slowly. We hypothesised that tests of incidental emotional memory may inform on different aspects of cognitive decline in MCI and AD. Patients with MCI, AD and Healthy Controls (HC) were asked to attend to emotional pictures (i.e., positive and neutral) sequentially presented during an fMRI session. Attention was monitored behaviourally. A surprise post-scan recognition test was then administered. The groups remained attentive within the scanner. The post-scan recognition pattern was in the form of (HC = MCI) > AD, with only the former group showing a clear benefit from emotional pictures. fMRI analysis of incidental encoding demonstrated clusters of activation in para-hippocampal regions and in the hippocampus in HC and MCI patients but not in AD patients. The pattern of activation observed in MCI patients tended to be greater than that found in HC. The results suggest that incidental emotional memory might offer a suitable platform to investigate, using behavioural and fMRI measures, subtle changes in the process of developing AD. These changes seem to differ from those found using standard episodic memory tests. The underpinnings of such differences and the potential clinical use of this methodology are discussed in depth.
MDMA enhances emotional empathy and prosocial behavior.
Hysek, Cédric M; Schmid, Yasmin; Simmler, Linda D; Domes, Gregor; Heinrichs, Markus; Eisenegger, Christoph; Preller, Katrin H; Quednow, Boris B; Liechti, Matthias E
2014-11-01
3,4-Methylenedioxymethamphetamine (MDMA, 'ecstasy') releases serotonin and norepinephrine. MDMA is reported to produce empathogenic and prosocial feelings. It is unknown whether MDMA in fact alters empathic concern and prosocial behavior. We investigated the acute effects of MDMA using the Multifaceted Empathy Test (MET), dynamic Face Emotion Recognition Task (FERT) and Social Value Orientation (SVO) test. We also assessed effects of MDMA on plasma levels of hormones involved in social behavior using a placebo-controlled, double-blind, random-order, cross-over design in 32 healthy volunteers (16 women). MDMA enhanced explicit and implicit emotional empathy in the MET and increased prosocial behavior in the SVO test in men. MDMA did not alter cognitive empathy in the MET but impaired the identification of negative emotions, including fearful, angry and sad faces, in the FERT, particularly in women. MDMA increased plasma levels of cortisol and prolactin, which are markers of serotonergic and noradrenergic activity, and of oxytocin, which has been associated with prosocial behavior. In summary, MDMA sex-specifically altered the recognition of emotions, emotional empathy and prosociality. These effects likely enhance sociability when MDMA is used recreationally and may be useful when MDMA is administered in conjunction with psychotherapy in patients with social dysfunction or post-traumatic stress disorder. © The Author (2013). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
Emotion recognition from EEG using higher order crossings.
Petrantonakis, Panagiotis C; Hadjileontiadis, Leontios J
2010-03-01
Electroencephalogram (EEG)-based emotion recognition is a relatively new field in affective computing, with challenging issues regarding the induction of emotional states and the extraction of features to achieve optimum classification performance. In this paper, a novel emotion evocation and EEG-based feature extraction technique is presented. In particular, the mirror neuron system concept was adapted to efficiently foster emotion induction through the process of imitation. In addition, higher order crossings (HOC) analysis was employed for the feature extraction scheme, and a robust classification method, the HOC-emotion classifier (HOC-EC), was implemented, testing four different classifiers [quadratic discriminant analysis (QDA), k-nearest neighbor, Mahalanobis distance, and support vector machines (SVMs)] to accomplish efficient emotion recognition. Through a series of facial expression image projections, EEG data were collected from 16 healthy subjects using only three EEG channels: Fp1, Fp2, and a bipolar channel at the F3 and F4 positions according to the 10-20 system. Two scenarios were examined, using EEG data from a single channel and from combined channels, respectively. Compared with other feature extraction methods, HOC-EC appears to outperform them, achieving 62.3% (using QDA) and 83.33% (using SVM) classification accuracy for the single-channel and combined-channel cases, respectively, differentiating among the six basic emotions: happiness, surprise, anger, fear, disgust, and sadness. As the emotion class set is reduced, HOC-EC converges toward the maximum classification rate (100% for five or fewer emotions), justifying the efficiency of the proposed approach.
This could facilitate the integration of HOC-EC in human machine interfaces, such as pervasive healthcare systems, enhancing their affective character and providing information about the user's emotional status (e.g., identifying user's emotion experiences, recurring affective states, time-dependent emotional trends).
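The HOC features used above are, in Kedem's formulation, obtained by repeatedly applying a high-pass filter (here, the simple backward-difference operator) to a zero-mean signal and counting the zero crossings after each filtering step. A minimal sketch of that idea, assuming the difference-filter variant of HOC (function and parameter names are illustrative, not taken from the paper):

```python
import numpy as np

def hoc_features(signal, order=10):
    """Higher-order crossings: count zero crossings of a zero-mean
    signal after applying the difference filter 0, 1, ... order-1 times."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                         # HOC assumes a zero-mean sequence
    feats = []
    for _ in range(order):
        signs = x >= 0                       # binary sign sequence
        crossings = int(np.sum(signs[1:] != signs[:-1]))
        feats.append(crossings)              # D_k: crossings at this filter order
        x = np.diff(x)                       # next-order backward difference
    return feats
```

The resulting feature vector (one count per filter order) would then feed a conventional classifier such as QDA or an SVM, as in the abstract.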
Facial recognition deficits as a potential endophenotype in bipolar disorder.
Vierck, Esther; Porter, Richard J; Joyce, Peter R
2015-11-30
Bipolar disorder (BD) is considered a highly heritable and genetically complex disorder. Several cognitive functions, such as executive functions and verbal memory, have been suggested as promising candidate endophenotypes. Although there is evidence for deficits in facial emotion recognition in individuals with BD, studies investigating these functions as endophenotypes are rare. The current study investigates emotion recognition as a potential endophenotype in BD by comparing 36 BD participants, 24 of their first-degree relatives, and 40 healthy control participants on a computerised facial emotion recognition task. Group differences were evaluated using repeated-measures analysis of covariance with age as a covariate. Results revealed slowed emotion recognition in both BD participants and their relatives. Furthermore, BD participants were less accurate than healthy controls in their recognition of emotion expressions. We found no evidence of emotion-specific differences between groups. Our results provide evidence for facial emotion recognition as a potential endophenotype in BD. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Clark, Uraina S.; Walker, Keenan A.; Cohen, Ronald A.; Devlin, Kathryn N.; Folkers, Anna M.; Pina, Mathew M.; Tashima, Karen T.
2015-01-01
Impaired facial emotion recognition abilities in HIV+ patients are well documented, but little is known about the neural etiology of these difficulties. We examined the relation of facial emotion recognition abilities to regional brain volumes in 44 HIV-positive (HIV+) and 44 HIV-negative control (HC) adults. Volumes of structures implicated in HIV-associated neuropathology and emotion recognition were measured on MRI using an automated segmentation tool. Relative to HC, HIV+ patients demonstrated emotion recognition impairments for fearful expressions, reduced anterior cingulate cortex (ACC) volumes, and increased amygdala volumes. In the HIV+ group, fear recognition impairments correlated significantly with ACC, but not amygdala, volumes. ACC reductions were also associated with lower nadir CD4 levels (i.e., greater HIV-disease severity). These findings extend our understanding of the neurobiological substrates underlying an essential social function, facial emotion recognition, in HIV+ individuals and implicate HIV-related ACC atrophy in the impairment of these abilities. PMID:25744868
Utterance independent bimodal emotion recognition in spontaneous communication
NASA Astrophysics Data System (ADS)
Tao, Jianhua; Pan, Shifeng; Yang, Minghao; Li, Ya; Mu, Kaihui; Che, Jianfeng
2011-12-01
Emotion expressions are sometimes mixed with utterance expression in spontaneous face-to-face communication, which makes emotion recognition difficult. This article introduces methods for reducing utterance influences on visual parameters in audio-visual emotion recognition. The audio and visual channels are first combined under a Multistream Hidden Markov Model (MHMM). Utterance reduction is then performed by computing the residual between the real visual parameters and the outputs of the utterance-related visual parameters. The article introduces a Fused Hidden Markov Model inversion method, trained on a neutrally expressed audio-visual corpus, to solve this problem. To reduce computational complexity, the inversion model is further simplified to a Gaussian Mixture Model (GMM) mapping. Compared with traditional bimodal emotion recognition methods (e.g., SVM, CART, boosting), the utterance reduction method gives better emotion recognition results. The experiments also show the effectiveness of the emotion recognition system when used in a live environment.
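The GMM-mapping stage can be sketched as joint-density regression: fit a GMM over concatenated audio-visual feature vectors from a neutral corpus, predict the utterance-related visual parameters for new audio as the posterior-weighted conditional mean, and treat the residual between observed and predicted visual parameters as the emotion cue. A minimal sketch under those assumptions (feature dimensions, model sizes, and all names are illustrative; the paper's actual features and models differ):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_joint_gmm(audio, visual, n_components=4, seed=0):
    """Fit a GMM on concatenated [audio, visual] vectors from a neutral corpus."""
    joint = np.hstack([audio, visual])
    gmm = GaussianMixture(n_components=n_components,
                          covariance_type="full", random_state=seed)
    gmm.fit(joint)
    return gmm

def predict_visual(gmm, audio, d_a):
    """E[visual | audio]: posterior-weighted conditional means per component."""
    d_v = gmm.means_.shape[1] - d_a
    preds = np.zeros((len(audio), d_v))
    for i, a in enumerate(audio):
        resp = np.zeros(gmm.n_components)
        cond = np.zeros((gmm.n_components, d_v))
        for k in range(gmm.n_components):
            mu_a, mu_v = gmm.means_[k, :d_a], gmm.means_[k, d_a:]
            S = gmm.covariances_[k]
            Saa, Sav = S[:d_a, :d_a], S[:d_a, d_a:]
            inv = np.linalg.inv(Saa)
            diff = a - mu_a
            # responsibility from the audio marginal density of component k
            resp[k] = gmm.weights_[k] * np.exp(-0.5 * diff @ inv @ diff) \
                      / np.sqrt(np.linalg.det(Saa) * (2 * np.pi) ** d_a)
            # conditional mean of the visual block given the audio block
            cond[k] = mu_v + Sav.T @ inv @ diff
        resp /= resp.sum()
        preds[i] = resp @ cond
    return preds

def utterance_residual(gmm, audio, visual, d_a):
    """Emotion cue: observed visual parameters minus the utterance-related
    part predicted from the audio channel."""
    return visual - predict_visual(gmm, audio, d_a)
```

The residual sequence would then be passed to the emotion classifier in place of the raw visual parameters, so that mouth movements driven by speech content are largely removed.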
Emotional Valence and Arousal Effects on Memory and Hemispheric Asymmetries
ERIC Educational Resources Information Center
Mneimne, Malek; Powers, Alice S.; Walton, Kate E.; Kosson, David S.; Fonda, Samantha; Simonetti, Jessica
2010-01-01
This study examined predictions based upon the right hemisphere (RH) model, the valence-arousal model, and a recently proposed integrated model (Killgore & Yurgelun-Todd, 2007) of emotion processing by testing immediate recall and recognition memory for positive, negative, and neutral verbal stimuli among 35 right-handed women. Building upon…
Development of Perceptual Expertise in Emotion Recognition
ERIC Educational Resources Information Center
Pollak, Seth D.; Messner, Michael; Kistler, Doris J.; Cohn, Jeffrey F.
2009-01-01
How do children's early social experiences influence their perception of emotion-specific information communicated by the face? To examine this question, we tested a group of abused children who had been exposed to extremely high levels of parental anger expression and physical threat. Children were presented with arrays of stimuli that depicted…
ERIC Educational Resources Information Center
Rojahn, Johannes; And Others
1995-01-01
This literature review discusses 21 studies on facial emotion recognition by persons with mental retardation in terms of methodological characteristics, stimulus material, salient variables and their relation to recognition tasks, and emotion recognition deficits in mental retardation. A table provides comparative data on all 21 studies. (DB)
Facial Emotion Recognition in Bipolar Disorder and Healthy Aging.
Altamura, Mario; Padalino, Flavia A; Stella, Eleonora; Balzotti, Angela; Bellomo, Antonello; Palumbo, Rocco; Di Domenico, Alberto; Mammarella, Nicola; Fairfield, Beth
2016-03-01
Emotional face recognition is impaired in bipolar disorder, but it is not clear whether this is specific for the illness. Here, we investigated how aging and bipolar disorder influence dynamic emotional face recognition. Twenty older adults, 16 bipolar patients, and 20 control subjects performed a dynamic affective facial recognition task and a subsequent rating task. Participants pressed a key as soon as they were able to discriminate whether the neutral face was assuming a happy or angry facial expression and then rated the intensity of each facial expression. Results showed that older adults recognized happy expressions faster, whereas bipolar patients recognized angry expressions faster. Furthermore, both groups rated emotional faces more intensely than did the control subjects. This study is one of the first to compare how aging and clinical conditions influence emotional facial recognition and underlines the need to consider the role of specific and common factors in emotional face recognition.
The involvement of emotion recognition in affective theory of mind.
Mier, Daniela; Lis, Stefanie; Neuthe, Kerstin; Sauer, Carina; Esslinger, Christine; Gallhofer, Bernd; Kirsch, Peter
2010-11-01
This study was conducted to explore the relationship between emotion recognition and affective Theory of Mind (ToM). Forty subjects performed a facial emotion recognition and an emotional intention recognition task (affective ToM) in an event-related fMRI study. Conjunction analysis revealed overlapping activation during both tasks. Activation in some of these conjunctly activated regions was even stronger during affective ToM than during emotion recognition, namely in the inferior frontal gyrus, the superior temporal sulcus, the temporal pole, and the amygdala. In contrast to previous studies investigating ToM, we found no activation in the anterior cingulate, commonly assumed as the key region for ToM. The results point to a close relationship of emotion recognition and affective ToM and can be interpreted as evidence for the assumption that at least basal forms of ToM occur by an embodied, non-cognitive process. Copyright © 2010 Society for Psychophysiological Research.
Impaired Emotion Recognition in Music in Parkinson's Disease
ERIC Educational Resources Information Center
van Tricht, Mirjam J.; Smeding, Harriet M. M.; Speelman, Johannes D.; Schmand, Ben A.
2010-01-01
Music has the potential to evoke strong emotions and plays a significant role in the lives of many people. Music might therefore be an ideal medium to assess emotion recognition. We investigated emotion recognition in music in 20 patients with idiopathic Parkinson's disease (PD) and 20 matched healthy volunteers. The role of cognitive dysfunction…
Emotion Recognition and Visual-Scan Paths in Fragile X Syndrome
ERIC Educational Resources Information Center
Shaw, Tracey A.; Porter, Melanie A.
2013-01-01
This study investigated emotion recognition abilities and visual scanning of emotional faces in 16 Fragile X syndrome (FXS) individuals compared to 16 chronological-age and 16 mental-age matched controls. The relationships between emotion recognition, visual scan-paths and symptoms of social anxiety, schizotypy and autism were also explored.…
Cognitive, emotional and social markers of serial murdering.
Angrilli, Alessandro; Sartori, Giuseppe; Donzella, Giovanna
2013-01-01
Although criminal psychopathy is starting to be relatively well described, our knowledge of the characteristics and scientific markers of serial murdering is still very poor. A serial killer who had murdered more than five people, KT, was administered a battery of standardized tests aimed at measuring neuropsychological impairment and social/emotional cognition deficits. KT exhibited a striking dissociation between a high level of emotional detachment and a low score on the antisocial behavior scale of the Psychopathy Checklist-Revised (PCL-R). The Minnesota Multiphasic Personality Inventory-2 showed a normal pattern, with the psychotic triad at borderline level. KT had a high intelligence score and showed almost no impairment on cognitive tests sensitive to frontal lobe dysfunction (Wisconsin Card Sorting Test, Theory of Mind, Tower of London; the latter showed a mild impairment in planning performance). In tests of moral, emotional, and social cognition, his patterns of response differed from those of matched controls and from past reports on criminal psychopaths: unlike these individuals, KT exhibited normal recognition of fear and a relatively intact knowledge of moral rules, but he was impaired in the recognition of anger, embarrassment, and conventional social rules. The overall picture of KT suggests that serial killing may be closer to normality than to psychopathy defined according to either the DSM-IV or the PCL-R, and may be characterized by relatively spared moral cognition and selective deficits in the social and emotional cognition domains.
Montgomery, Charlotte B; Allison, Carrie; Lai, Meng-Chuan; Cassidy, Sarah; Langdon, Peter E; Baron-Cohen, Simon
2016-06-01
The present study examined whether adults with high functioning autism (HFA) showed greater difficulties in (1) their self-reported ability to empathise with others and/or (2) their ability to read mental states in others' eyes than adults with Asperger syndrome (AS). The Empathy Quotient (EQ) and 'Reading the Mind in the Eyes' Test (Eyes Test) were compared in 43 adults with AS and 43 adults with HFA. No significant difference was observed on EQ score between groups, while adults with AS performed significantly better on the Eyes Test than those with HFA. This suggests that adults with HFA may need more support, particularly in mentalizing and complex emotion recognition, and raises questions about the existence of subgroups within autism spectrum conditions.
Elfenbein, Hillary Anger; Jang, Daisung; Sharma, Sudeep; Sanchez-Burks, Jeffrey
2017-03-01
Emotional intelligence (EI) has captivated researchers and the public alike, but it has been challenging to establish its components as objective abilities. Self-report scales lack divergent validity from personality traits, and few ability tests have objectively correct answers. We adapt the Stroop task to introduce a new facet of EI called emotional attention regulation (EAR), which involves focusing emotion-related attention for the sake of information processing rather than for the sake of regulating one's own internal state. EAR includes two distinct components. First, tuning in to nonverbal cues involves identifying nonverbal cues while ignoring alternate content, that is, emotion recognition under conditions of distraction by competing stimuli. Second, tuning out of nonverbal cues involves ignoring nonverbal cues while identifying alternate content, that is, the ability to interrupt emotion recognition when needed to focus attention elsewhere. An auditory test of valence included positive and negative words spoken in positive and negative vocal tones. A visual test of approach-avoidance included green- and red-colored facial expressions depicting happiness and anger. The error rates for incongruent trials met the key criteria for establishing the validity of an EI test: the measure demonstrated test-retest reliability, convergent validity with other EI measures, divergent validity from factors such as general processing speed and, for the most part, personality traits, and predictive validity, in this case for well-being. By demonstrating that facets of EI can be validly theorized and empirically assessed, the results also speak to the validity of EI more generally. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Norton, Daniel; McBain, Ryan; Holt, Daphne J; Ongur, Dost; Chen, Yue
2009-06-15
Impaired emotion recognition has been reported in schizophrenia, yet the nature of this impairment is not completely understood. Recognition of facial emotion depends on processing affective and nonaffective facial signals, as well as basic visual attributes. We examined whether and how poor facial emotion recognition in schizophrenia is related to basic visual processing and nonaffective face recognition. Schizophrenia patients (n = 32) and healthy control subjects (n = 29) performed emotion discrimination, identity discrimination, and visual contrast detection tasks, where the emotionality, distinctiveness of identity, or visual contrast was systematically manipulated. Subjects determined which of two presentations in a trial contained the target: the emotional face for emotion discrimination, a specific individual for identity discrimination, and a sinusoidal grating for contrast detection. Patients had significantly higher thresholds (worse performance) than control subjects for discriminating both fearful and happy faces. Furthermore, patients' poor performance in fear discrimination was predicted by performance in visual detection and face identity discrimination. Schizophrenia patients require greater emotional signal strength to discriminate fearful or happy face images from neutral ones. Deficient emotion recognition in schizophrenia does not appear to be determined solely by affective processing but is also linked to the processing of basic visual and facial information.
Processing of Facial Emotion in Bipolar Depression and Euthymia.
Robinson, Lucy J; Gray, John M; Burt, Mike; Ferrier, I Nicol; Gallagher, Peter
2015-10-01
Previous studies of facial emotion processing in bipolar disorder (BD) have reported conflicting findings. In independently conducted studies, we investigated facial emotion labeling in euthymic and depressed BD patients using tasks with static and dynamically morphed images of different emotions displayed at different intensities. Study 1 included 38 euthymic BD patients and 28 controls. Participants completed two tasks: labeling of static images of basic facial emotions (anger, disgust, fear, happy, sad) shown at different expression intensities, and the Eyes Test (Baron-Cohen, Wheelwright, Hill, Raste, & Plumb, 2001), which involves recognition of complex emotions using only the eye region of the face. Study 2 included 53 depressed BD patients and 47 controls. Participants completed two tasks: labeling of "dynamic" facial expressions of the same five basic emotions, and the Emotional Hexagon test (Young, Perrett, Calder, Sprengelmeyer, & Ekman, 2002). Neither patient group differed significantly from controls on any measure of emotion perception/labeling. A significant group by intensity interaction was observed in both emotion labeling tasks (euthymia and depression), although this effect did not survive the addition of measures of executive function/psychomotor speed as covariates. Only 2.6-15.8% of euthymic patients and 7.8-13.7% of depressed patients scored below the 10th percentile of the controls for total emotion recognition accuracy. There was no evidence of specific deficits in facial emotion labeling in euthymic or depressed BD patients. Methodological variations, including mood state, sample size, and the cognitive demands of the tasks, may contribute significantly to the variability in findings between studies.
Family environment influences emotion recognition following paediatric traumatic brain injury
SCHMIDT, ADAM T.; ORSTEN, KIMBERLEY D.; HANTEN, GERRI R.; LI, XIAOQI; LEVIN, HARVEY S.
2011-01-01
Objective: This study investigated the relationship between family functioning and performance on two tasks of emotion recognition (emotional prosody and face emotion recognition) and a cognitive control procedure (the Flanker task) following paediatric traumatic brain injury (TBI) or orthopaedic injury (OI). Methods: A total of 142 children (75 TBI, 67 OI) were assessed on three occasions (baseline, 3 months, and 1 year post-injury) on the two emotion recognition tasks and the Flanker task. Caregivers also completed the Life Stressors and Resources Scale (LISRES) on each occasion. Growth curve analysis was used to analyse the data. Results: Family functioning influenced performance on the emotional prosody and Flanker tasks but not on the face emotion recognition task. Findings on both the emotional prosody and Flanker tasks were generally similar across groups. However, financial resources emerged as significantly related to emotional prosody performance in the TBI group only (p = 0.0123). Conclusions: Findings suggest that family functioning variables, especially financial resources, can influence performance on an emotional processing task following TBI in children. PMID:21058900
Affective responsiveness is influenced by intake of oral contraceptives.
Radke, Sina; Derntl, Birgit
2016-06-01
Despite the widespread use of oral contraceptive pills (OCs), little is known about their impact on psychological processes and emotional competencies. Recent data indicate impaired emotion recognition in OC users compared to naturally cycling females. Building upon these findings, the current study investigated the influence of OC use on three components of empathy, i.e., emotion recognition, perspective-taking, and affective responsiveness. We compared naturally cycling women to two groups of OC users, one being tested in their pill-free week and one in the phase of active intake. Whereas groups did not differ in emotion recognition and perspective-taking, an effect of pill phase was evident for affective responsiveness: Females currently taking the pill showed better performance than those in their pill-free week. These processing advantages complement previous findings on menstrual cycle effects and thereby suggest an association with changes in endogenous and exogenous reproductive hormones. The current study highlights the need for future research to shed more light on the neuroendocrine alterations accompanying OC intake. Copyright © 2016 Elsevier B.V. and ECNP. All rights reserved.
Herniman, Sarah E; Allott, Kelly A; Killackey, Eóin; Hester, Robert; Cotton, Sue M
2017-01-15
Comorbid depression is common in first-episode schizophrenia spectrum (FES) disorders. Both depression and FES are associated with significant deficits in facial and prosody emotion recognition performance. However, it remains unclear whether people with FES and comorbid depression, compared to those without comorbid depression, have overall poorer emotion recognition, or instead, a different pattern of emotion recognition deficits. The aim of this study was to compare facial and prosody emotion recognition performance between those with and without comorbid depression in FES. This study involved secondary analysis of baseline data from a randomized controlled trial of vocational intervention for young people with first-episode psychosis (N=82; age range: 15-25 years). Those with comorbid depression (n=24) had more accurate recognition of sadness in faces compared to those without comorbid depression. Severity of depressive symptoms was also associated with more accurate recognition of sadness in faces. Such results did not recur for prosody emotion recognition. In addition to the cross-sectional design, limitations of this study include the absence of facial and prosodic recognition of neutral emotions. Findings indicate a mood congruent negative bias in facial emotion recognition in those with comorbid depression and FES, and provide support for cognitive theories of depression that emphasise the role of such biases in the development and maintenance of depression. Longitudinal research is needed to determine whether mood-congruent negative biases are implicated in the development and maintenance of depression in FES, or whether such biases are simply markers of depressed state. Copyright © 2016 Elsevier B.V. All rights reserved.
Feeser, Melanie; Fan, Yan; Weigand, Anne; Hahn, Adam; Gärtner, Matti; Aust, Sabine; Böker, Heinz; Bajbouj, Malek; Grimm, Simone
2014-12-01
Previous studies have shown that oxytocin (OXT) enhances social cognitive processes. It has also been demonstrated that OXT does not uniformly facilitate social cognition. The effects of OXT administration strongly depend on the exposure to stressful experiences in early life. Emotional facial recognition is crucial for social cognition. However, no study has yet examined how the effects of OXT on the ability to identify emotional faces are altered by early life stress (ELS) experiences. Given the role of OXT in modulating social motivational processes, we specifically aimed to investigate its effects on the recognition of approach- and avoidance-related facial emotions. In a double-blind, between-subjects, placebo-controlled design, 82 male participants performed an emotion recognition task with faces taken from the "Karolinska Directed Emotional Faces" set. We clustered the six basic emotions along the dimensions approach (happy, surprise, anger) and avoidance (fear, sadness, disgust). ELS was assessed with the Childhood Trauma Questionnaire (CTQ). Our results showed that OXT improved the ability to recognize avoidance-related emotional faces as compared to approach-related emotional faces. Whereas the performance for avoidance-related emotions in participants with higher ELS scores was comparable in both OXT and placebo condition, OXT enhanced emotion recognition in participants with lower ELS scores. Independent of OXT administration, we observed increased emotion recognition for avoidance-related faces in participants with high ELS scores. Our findings suggest that the investigation of OXT on social recognition requires a broad approach that takes ELS experiences as well as motivational processes into account.
Baez, Sandra; Marengo, Juan; Perez, Ana; Huepe, David; Font, Fernanda Giralt; Rial, Veronica; Gonzalez-Gadea, María Luz; Manes, Facundo; Ibanez, Agustin
2015-09-01
Impaired social cognition has been claimed to be a mechanism underlying the development and maintenance of borderline personality disorder (BPD). One important aspect of social cognition is the theory of mind (ToM), a complex skill that seems to be influenced by more basic processes, such as executive functions (EF) and emotion recognition. Previous ToM studies in BPD have yielded inconsistent results. This study assessed the performance of BPD adults on ToM, emotion recognition, and EF tasks. We also examined whether EF and emotion recognition could predict the performance on ToM tasks. We evaluated 15 adults with BPD and 15 matched healthy controls using different tasks of EF, emotion recognition, and ToM. The results showed that BPD adults exhibited deficits in the three domains, which seem to be task-dependent. Furthermore, we found that EF and emotion recognition predicted the performance on ToM. Our results suggest that tasks that involve real-life social scenarios and contextual cues are more sensitive to detect ToM and emotion recognition deficits in BPD individuals. Our findings also indicate that (a) ToM variability in BPD is partially explained by individual differences on EF and emotion recognition; and (b) ToM deficits of BPD patients are partially explained by the capacity to integrate cues from face, prosody, gesture, and social context to identify the emotions and others' beliefs. © 2014 The British Psychological Society.
Caballero-Morales, Santiago-Omar
2013-01-01
An approach for the recognition of emotions in speech is presented. The target language is Mexican Spanish, and for this purpose a speech database was created. The approach consists of phoneme-level acoustic modelling of emotion-specific vowels. For this, a standard phoneme-based Automatic Speech Recognition (ASR) system was built with Hidden Markov Models (HMMs), where separate phoneme HMMs were built for the consonants and for the emotion-specific vowels associated with four emotional states (anger, happiness, neutral, sadness). The emotional state of a spoken sentence is then estimated by counting the number of emotion-specific vowels found in the ASR's output for that sentence. With this approach, accuracy of 87–100% was achieved for the recognition of the emotional state of Mexican Spanish speech. PMID:23935410
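The estimation step the abstract describes (majority vote over emotion-tagged vowels in the ASR output) can be sketched as follows; this is a minimal illustration, not the paper's implementation, and the vowel-tagging scheme (e.g. `a_anger`) is an assumption for the example:

```python
from collections import Counter

EMOTIONS = ("anger", "happiness", "neutral", "sadness")

def estimate_emotion(asr_phonemes):
    """Estimate the emotional state of a sentence by counting
    emotion-specific vowel symbols in the ASR output and taking
    the most frequent emotion.

    asr_phonemes: list of phoneme labels, where emotion-specific
    vowels are assumed to be tagged like 'a_anger' or 'e_sadness'
    (consonants carry no tag).
    """
    counts = Counter(
        label.split("_", 1)[1]
        for label in asr_phonemes
        if "_" in label and label.split("_", 1)[1] in EMOTIONS
    )
    if not counts:
        return "neutral"  # no emotion-tagged vowels recognized
    return counts.most_common(1)[0][0]

# Example: a decoded sentence whose vowels mostly carry the 'anger' tag
print(estimate_emotion(["k", "a_anger", "s", "a_anger", "e_sadness"]))  # anger
```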
Corcoran, C M; Keilp, J G; Kayser, J; Klim, C; Butler, P D; Bruder, G E; Gur, R C; Javitt, D C
2015-10-01
Schizophrenia is characterized by profound and disabling deficits in the ability to recognize emotion in facial expression and tone of voice. Although these deficits are well documented in established schizophrenia using recently validated tasks, their predictive utility in at-risk populations has not been formally evaluated. The Penn Emotion Recognition and Discrimination tasks, and recently developed measures of auditory emotion recognition, were administered to 49 clinical high-risk subjects prospectively followed for 2 years for schizophrenia outcome, and 31 healthy controls, and a developmental cohort of 43 individuals aged 7-26 years. Deficit in emotion recognition in at-risk subjects was compared with deficit in established schizophrenia, and with normal neurocognitive growth curves from childhood to early adulthood. Deficits in emotion recognition significantly distinguished at-risk patients who transitioned to schizophrenia. By contrast, more general neurocognitive measures, such as attention vigilance or processing speed, were non-predictive. The best classification model for schizophrenia onset included both face emotion processing and negative symptoms, with accuracy of 96%, and area under the receiver-operating characteristic curve of 0.99. In a parallel developmental study, emotion recognition abilities were found to reach maturity prior to traditional age of risk for schizophrenia, suggesting they may serve as objective markers of early developmental insult. Profound deficits in emotion recognition exist in at-risk patients prior to schizophrenia onset. They may serve as an index of early developmental insult, and represent an effective target for early identification and remediation. Future studies investigating emotion recognition deficits at both mechanistic and predictive levels are strongly encouraged.
Horton, Leslie E; Bridgwater, Miranda A; Haas, Gretchen L
2017-05-01
Emotion recognition, a social cognition domain, is impaired in people with schizophrenia and contributes to social dysfunction. Whether impaired emotion recognition emerges as a manifestation of illness or predates symptoms is unclear. Findings from studies of emotion recognition impairments in first-degree relatives of people with schizophrenia are mixed and, to our knowledge, no studies have investigated the link between emotion recognition and social functioning in that population. This study examined facial affect recognition and social skills in 16 offspring of parents with schizophrenia (familial high-risk/FHR) compared to 34 age- and sex-matched healthy controls (HC), ages 7-19. As hypothesised, FHR children exhibited impaired overall accuracy, accuracy in identifying fearful faces, and overall recognition speed relative to controls. Age-adjusted facial affect recognition accuracy scores predicted parent's overall rating of their child's social skills for both groups. This study supports the presence of facial affect recognition deficits in FHR children. Importantly, as the first known study to suggest the presence of these deficits in young, asymptomatic FHR children, it extends findings to a developmental stage predating symptoms. Further, findings point to a relationship between early emotion recognition and social skills. Improved characterisation of deficits in FHR children could inform early intervention.
ERIC Educational Resources Information Center
Evers, Kris; Steyaert, Jean; Noens, Ilse; Wagemans, Johan
2015-01-01
Emotion labelling was evaluated in two matched samples of 6-14-year old children with and without an autism spectrum disorder (ASD; N = 45 and N = 50, resp.), using six dynamic facial expressions. The Emotion Recognition Task proved to be valuable demonstrating subtle emotion recognition difficulties in ASD, as we showed a general poorer emotion…
Omar, Rohani; Henley, Susie M.D.; Bartlett, Jonathan W.; Hailstone, Julia C.; Gordon, Elizabeth; Sauter, Disa A.; Frost, Chris; Scott, Sophie K.; Warren, Jason D.
2011-01-01
Despite growing clinical and neurobiological interest in the brain mechanisms that process emotion in music, these mechanisms remain incompletely understood. Patients with frontotemporal lobar degeneration (FTLD) frequently exhibit clinical syndromes that illustrate the effects of breakdown in emotional and social functioning. Here we investigated the neuroanatomical substrate for recognition of musical emotion in a cohort of 26 patients with FTLD (16 with behavioural variant frontotemporal dementia, bvFTD, 10 with semantic dementia, SemD) using voxel-based morphometry. On neuropsychological evaluation, patients with FTLD showed deficient recognition of canonical emotions (happiness, sadness, anger and fear) from music as well as faces and voices compared with healthy control subjects. Impaired recognition of emotions from music was specifically associated with grey matter loss in a distributed cerebral network including insula, orbitofrontal cortex, anterior cingulate and medial prefrontal cortex, anterior temporal and more posterior temporal and parietal cortices, amygdala and the subcortical mesolimbic system. This network constitutes an essential brain substrate for recognition of musical emotion that overlaps with brain regions previously implicated in coding emotional value, behavioural context, conceptual knowledge and theory of mind. Musical emotion recognition may probe the interface of these processes, delineating a profile of brain damage that is essential for the abstraction of complex social emotions. PMID:21385617
Saive, Anne-Lise; Royet, Jean-Pierre; Ravel, Nadine; Thévenet, Marc; Garcia, Samuel; Plailly, Jane
2014-01-01
We behaviorally explore the link between olfaction, emotion and memory by testing the hypothesis that the emotion carried by odors facilitates the memory of specific unique events. To investigate this idea, we used a novel behavioral approach inspired by a paradigm developed by our team to study episodic memory in a controlled and as ecological as possible way in humans. The participants freely explored three unique and rich laboratory episodes; each episode consisted of three unfamiliar odors (What) positioned at three specific locations (Where) within a visual context (Which context). During the retrieval test, which occurred 24–72 h after the encoding, odors were used to trigger the retrieval of the complex episodes. The participants were proficient in recognizing the target odors among distractors and retrieving the visuospatial context in which they were encountered. The episodic nature of the task generated high and stable memory performances, which were accompanied by faster responses and slower and deeper breathing. Successful odor recognition and episodic memory were not related to differences in odor investigation at encoding. However, memory performances were influenced by the emotional content of the odors, regardless of odor valence, with both pleasant and unpleasant odors generating higher recognition and episodic retrieval than neutral odors. Finally, the present study also suggested that when the binding between the odors and the spatio-contextual features of the episode was successful, the odor recognition and the episodic retrieval collapsed into a unique memory process that began as soon as the participants smelled the odors. PMID:24936176
ERIC Educational Resources Information Center
Golan, Ofer; Gordon, Ilanit; Fichman, Keren; Keinan, Giora
2018-01-01
Children with ASD show emotion recognition difficulties, as part of their social communication deficits. We examined facial emotion recognition (FER) in intellectually disabled children with ASD and in younger typically developing (TD) controls, matched on mental age. Our emotion-matching paradigm employed three different modalities: facial, vocal…
ERIC Educational Resources Information Center
Schmidt, Adam T.; Hanten, Gerri R.; Li, Xiaoqi; Orsten, Kimberley D.; Levin, Harvey S.
2010-01-01
Children with closed head injuries often experience significant and persistent disruptions in their social and behavioral functioning. Studies with adults sustaining a traumatic brain injury (TBI) indicate deficits in emotion recognition and suggest that these difficulties may underlie some of the social deficits. The goal of the current study was…
Martínez-Castilla, Pastora; Burt, Michael; Borgatti, Renato; Gagliardi, Chiara
2015-01-01
In this study both the matching and developmental trajectories approaches were used to clarify questions that remain open in the literature on facial emotion recognition in Williams syndrome (WS) and Down syndrome (DS). The matching approach showed that individuals with WS or DS exhibit neither proficiency for the expression of happiness nor specific impairments for negative emotions. Instead, they present the same pattern of emotion recognition as typically developing (TD) individuals. Thus, the better performance on the recognition of positive compared to negative emotions usually reported in WS and DS is not specific of these populations but seems to represent a typical pattern. Prior studies based on the matching approach suggested that the development of facial emotion recognition is delayed in WS and atypical in DS. Nevertheless, and even though performance levels were lower in DS than in WS, the developmental trajectories approach used in this study evidenced that not only individuals with DS but also those with WS present atypical development in facial emotion recognition. Unlike in the TD participants, where developmental changes were observed along with age, in the WS and DS groups, the development of facial emotion recognition was static. Both individuals with WS and those with DS reached an early maximum developmental level due to cognitive constraints.
Attentional biases and memory for emotional stimuli in men and male rhesus monkeys.
Lacreuse, Agnès; Schatz, Kelly; Strazzullo, Sarah; King, Hanna M; Ready, Rebecca
2013-11-01
We examined attentional biases for social and non-social emotional stimuli in young adult men and compared the results to those of male rhesus monkeys (Macaca mulatta) previously tested in a similar dot-probe task (King et al. in Psychoneuroendocrinology 37(3):396-409, 2012). Recognition memory for these stimuli was also analyzed in each species, using a recognition memory task in humans and a delayed non-matching-to-sample task in monkeys. We found that both humans and monkeys displayed a similar pattern of attentional biases toward threatening facial expressions of conspecifics. The bias was significant in monkeys and of marginal significance in humans. In addition, humans, but not monkeys, exhibited an attentional bias away from negative non-social images. Attentional biases for social and non-social threat differed significantly, with both species showing a pattern of vigilance toward negative social images and avoidance of negative non-social images. Positive stimuli did not elicit significant attentional biases for either species. In humans, emotional content facilitated the recognition of non-social images, but no effect of emotion was found for the recognition of social images. Recognition accuracy was not affected by emotion in monkeys, but response times were faster for negative relative to positive images. Altogether, these results suggest shared mechanisms of social attention in humans and monkeys, with both species showing a pattern of selective attention toward threatening faces of conspecifics. These data are consistent with the view that selective vigilance to social threat is the result of evolutionary constraints. Yet, selective attention to threat was weaker in humans than in monkeys, suggesting that regulatory mechanisms enable non-anxious humans to reduce sensitivity to social threat in this paradigm, likely through enhanced prefrontal control and reduced amygdala activation. 
In addition, the findings emphasize important differences in attentional biases to social versus non-social threat in both species. Differences in the impact of emotional stimuli on recognition memory between monkeys and humans will require further study, as methodological differences in the recognition tasks may have affected the results.
Emotional recognition in depressed epilepsy patients.
Brand, Jesse G; Burton, Leslie A; Schaffer, Sarah G; Alper, Kenneth R; Devinsky, Orrin; Barr, William B
2009-07-01
The current study examined the relationship between emotional recognition and depression using the Minnesota Multiphasic Personality Inventory, Second Edition (MMPI-2), in a population with epilepsy. Participants were a mixture of surgical candidates in addition to those receiving neuropsychological testing as part of a comprehensive evaluation. Results suggested that patients with epilepsy reporting increased levels of depression (Scale D) performed better than those patients reporting low levels of depression on an index of simple facial recognition, and depression was associated with poor prosody discrimination. Further, it is notable that more than half of the present sample had significantly elevated Scale D scores. The potential effects of a mood-congruent bias and implications for social functioning in depressed patients with epilepsy are discussed.
Age differences in right-wing authoritarianism and their relation to emotion recognition.
Ruffman, Ted; Wilson, Marc; Henry, Julie D; Dawson, Abigail; Chen, Yan; Kladnitski, Natalie; Myftari, Ella; Murray, Janice; Halberstadt, Jamin; Hunter, John A
2016-03-01
This study examined the correlates of right-wing authoritarianism (RWA) in older adults. Participants were given tasks measuring emotion recognition, executive functions and fluid IQ and questionnaires measuring RWA, perceived threat and social dominance orientation. Study 1 established higher age-related RWA across the age span in more than 2,600 New Zealanders. Studies 2 to 4 found that threat, education, social dominance and age all predicted unique variance in older adults' RWA, but the most consistent predictor was emotion recognition, predicting unique variance in older adults' RWA independent of all other variables. We argue that older adults' worse emotion recognition is associated with a more general change in social judgment. Expression of extreme attitudes (right- or left-wing) has the potential to antagonize others, but worse emotion recognition means that subtle signals will not be perceived, making the expression of extreme attitudes more likely. Our findings are consistent with other studies showing that worsening emotion recognition underlies age-related declines in verbosity, understanding of social gaffes, and ability to detect lies. Such results indicate that emotion recognition is a core social insight linked to many aspects of social cognition. (c) 2016 APA, all rights reserved.
Facial emotion recognition in patients with focal and diffuse axonal injury.
Yassin, Walid; Callahan, Brandy L; Ubukata, Shiho; Sugihara, Genichi; Murai, Toshiya; Ueda, Keita
2017-01-01
Facial emotion recognition impairment has been well documented in patients with traumatic brain injury. Studies exploring the neural substrates involved in such deficits have implicated specific grey matter structures (e.g. orbitofrontal regions), as well as diffuse white matter damage. Our study aims to clarify whether different types of injuries (i.e. focal vs. diffuse) will lead to different types of impairments on facial emotion recognition tasks, as no study has directly compared these patients. The present study examined performance and response patterns on a facial emotion recognition task in 14 participants with diffuse axonal injury (DAI), 14 with focal injury (FI) and 22 healthy controls. We found that, overall, participants with FI and DAI performed more poorly than controls on the facial emotion recognition task. Further, we observed comparable emotion recognition performance in participants with FI and DAI, despite differences in the nature and distribution of their lesions. However, the rating response pattern between the patient groups was different. This is the first study to show that pure DAI, without gross focal lesions, can independently lead to facial emotion recognition deficits and that rating patterns differ depending on the type and location of trauma.
Aviezer, Hillel; Hassin, Ran. R.; Bentin, Shlomo
2011-01-01
In the current study we examined the recognition of facial expressions embedded in emotionally expressive bodies in case LG, an individual with a rare form of developmental visual agnosia who suffers from severe prosopagnosia. Neuropsychological testing demonstrated that LG's agnosia is characterized by profoundly impaired visual integration. Unlike individuals with typical developmental prosopagnosia who display specific difficulties with face identity (but typically not expression) recognition, LG was also impaired at recognizing isolated facial expressions. By contrast, he successfully recognized the expressions portrayed by faceless emotional bodies handling affective paraphernalia. When presented with contextualized faces in emotional bodies his ability to detect the emotion expressed by a face did not improve even if it was embedded in an emotionally-congruent body context. Furthermore, in contrast to controls, LG displayed an abnormal pattern of contextual influence from emotionally-incongruent bodies. The results are interpreted in the context of a general integration deficit in developmental visual agnosia, suggesting that impaired integration may extend from the level of the face to the level of the full person. PMID:21482423
NASA Astrophysics Data System (ADS)
Kasyidi, Fatan; Puji Lestari, Dessi
2018-03-01
One of the important aspects of human-to-human communication is understanding the emotion of each party. Recently, interaction between humans and computers has continued to develop, especially affective interaction, in which emotion recognition is an important component. This paper presents our extended work on emotion recognition in spoken Indonesian to identify four main classes of emotion: Happy, Sad, Angry, and Contentment, using a combination of acoustic/prosodic features and lexical features. We constructed an emotional speech corpus from Indonesian television talk shows, where the situations are as close as possible to natural ones. After constructing the corpus, the acoustic/prosodic and lexical features were extracted to train the emotion model. We employed machine learning algorithms such as Support Vector Machine (SVM), Naive Bayes, and Random Forest to obtain the best model. Experiments on the test data show that the best model achieves an F-measure of 0.447 using only the acoustic/prosodic features, and 0.488 using both acoustic/prosodic and lexical features, to recognize the four emotion classes with an SVM with RBF kernel.
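The modelling step described above (an RBF-kernel SVM over concatenated acoustic/prosodic and lexical feature vectors) can be sketched with scikit-learn. The data below are synthetic stand-ins, not the paper's corpus, and the feature counts are illustrative assumptions:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)

# Synthetic stand-ins: 200 utterances with 12 acoustic/prosodic
# features (e.g. pitch/energy statistics) and 8 lexical features
# (e.g. emotion-word indicators). Real features would be extracted
# from the speech corpus.
X_acoustic = rng.normal(size=(200, 12))
X_lexical = rng.normal(size=(200, 8))
X = np.hstack([X_acoustic, X_lexical])        # combined feature vector
y = rng.integers(0, 4, size=200)              # 4 emotion classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# RBF-kernel SVM, as in the paper's best-performing configuration
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)
score = f1_score(y_te, clf.predict(X_te), average="macro")
print(f"macro F1: {score:.3f}")
```

On real features the macro F-measure would be compared across the acoustic-only and acoustic+lexical variants, as the abstract reports.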
Münkler, Paula; Rothkirch, Marcus; Dalati, Yasmin; Schmack, Katharina; Sterzer, Philipp
2015-01-01
Cognitive theories of depression posit that perception is negatively biased in depressive disorder. Previous studies have provided empirical evidence for this notion, but left open the question whether the negative perceptual bias reflects a stable trait or the current depressive state. Here we investigated the stability of negatively biased perception over time. Emotion perception was examined in patients with major depressive disorder (MDD) and healthy control participants in two experiments. In the first experiment subjective biases in the recognition of facial emotional expressions were assessed. Participants were presented with faces that were morphed between sad and neutral and happy expressions and had to decide whether the face was sad or happy. The second experiment assessed automatic emotion processing by measuring the potency of emotional faces to gain access to awareness using interocular suppression. A follow-up investigation using the same tests was performed three months later. In the emotion recognition task, patients with major depression showed a shift in the criterion for the differentiation between sad and happy faces: In comparison to healthy controls, patients with MDD required a greater intensity of the happy expression to recognize a face as happy. After three months, this negative perceptual bias was reduced in comparison to the control group. The reduction in negative perceptual bias correlated with the reduction of depressive symptoms. In contrast to previous work, we found no evidence for preferential access to awareness of sad vs. happy faces. Taken together, our results indicate that MDD-related perceptual biases in emotion recognition reflect the current clinical state rather than a stable depressive trait.
Four-Channel Biosignal Analysis and Feature Extraction for Automatic Emotion Recognition
NASA Astrophysics Data System (ADS)
Kim, Jonghwa; André, Elisabeth
This paper investigates the potential of physiological signals as a reliable channel for automatic recognition of a user's emotional state. Compared to audio-visual emotion channels such as facial expression or speech, little attention has so far been paid to physiological signals for emotion recognition. All essential stages of an automatic recognition system using biosignals are discussed, from recording a physiological dataset up to feature-based multiclass classification. Four-channel biosensors are used to measure electromyogram, electrocardiogram, skin conductivity and respiration changes. A wide range of physiological features from various analysis domains, including time/frequency, entropy, geometric analysis, subband spectra, and multiscale entropy, is proposed in order to find the most emotion-relevant features and to correlate them with emotional states. The best extracted features are specified in detail and their effectiveness is demonstrated by the emotion recognition results.
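As a rough illustration of the feature-extraction stage, the sketch below computes a few common time-domain statistics per biosignal channel; the paper's full feature set also spans frequency, entropy, geometric, subband, and multiscale-entropy domains, and the channel names and statistics chosen here are assumptions for the example:

```python
import numpy as np

def channel_features(signal):
    """A handful of time-domain statistics of the kind commonly
    extracted per biosignal channel before classification."""
    diff = np.diff(signal)
    return {
        "mean": float(np.mean(signal)),
        "std": float(np.std(signal)),
        "rms": float(np.sqrt(np.mean(signal ** 2))),
        "mean_abs_diff": float(np.mean(np.abs(diff))),  # first-difference activity
    }

# Four channels: EMG, ECG, skin conductivity, respiration
# (synthetic signals stand in for real sensor recordings here)
rng = np.random.default_rng(1)
feats = {ch: channel_features(rng.normal(size=1000))
         for ch in ("emg", "ecg", "sc", "rsp")}
print(sorted(feats["ecg"]))
```

Concatenating such per-channel feature dictionaries into one vector per trial yields the input for the feature-based multiclass classifier the abstract mentions.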
Speech emotion recognition methods: A literature review
NASA Astrophysics Data System (ADS)
Basharirad, Babak; Moradhaseli, Mohammadreza
2017-10-01
Recently, research on emotional speech signals has attracted growing attention in human-machine interfaces, due to the availability of high computational capability. Many systems have been proposed in the literature to identify emotional states through speech. Selecting suitable feature sets, designing proper classification methods, and preparing an appropriate dataset are the main key issues for speech emotion recognition systems. This paper critically analyzes the currently available approaches to speech emotion recognition with respect to three evaluation parameters (feature set, classification of features, and accuracy). In addition, it evaluates the performance and limitations of the available methods, and highlights promising directions for improving speech emotion recognition systems.
[Emotional facial expression recognition impairment in Parkinson disease].
Lachenal-Chevallet, Karine; Bediou, Benoit; Bouvard, Martine; Thobois, Stéphane; Broussolle, Emmanuel; Vighetto, Alain; Krolak-Salmon, Pierre
2006-03-01
Some behavioral disturbances observed in Parkinson's disease (PD) could be related to impaired recognition of various social messages, particularly emotional facial expressions. Facial expression recognition was assessed using morphed faces (five emotions: happiness, fear, anger, disgust, neutral) and compared to gender recognition and general cognitive assessment in 12 patients with Parkinson's disease and 14 control subjects. Facial expression recognition was impaired among patients, whereas gender recognition, visuo-perceptive capacities and overall efficiency were preserved. Post hoc analyses disclosed a deficit in fear and disgust recognition compared to control subjects. The impairment of emotional facial expression recognition in PD appears independent of other cognitive deficits. It may be related to dopaminergic depletion in the basal ganglia and limbic brain regions, and could play a part in the psycho-behavioral disorders, particularly the communication disorders, observed in Parkinson's disease patients.
van Marle, Hein J F; Hermans, Erno J; Qin, Shaozheng; Overeem, Sebastiaan; Fernández, Guillén
2013-09-01
A host of animal work demonstrates that the retention benefit for emotionally aversive over neutral memories is regulated by glucocorticoid action during memory consolidation. Particularly, glucocorticoids may affect systems-level processes that promote the gradual reorganization of emotional memory traces. These effects remain largely uninvestigated in humans. Therefore, in this functional magnetic resonance imaging study we administered hydrocortisone during a polysomnographically monitored night of sleep directly after healthy volunteers studied negative and neutral pictures in a double-blind, placebo-controlled, between-subjects design. The following evening memory consolidation was probed during a recognition memory test in the MR scanner by assessing the difference in brain activity associated with memory for the consolidated items studied before sleep and new, unconsolidated items studied shortly before test (remote vs. recent memory paradigm). Hydrocortisone administration resulted in elevated cortisol levels throughout the experimental night with no group difference at recent encoding or test. Behaviorally, we showed that cortisol enhanced the difference between emotional and neutral consolidated memory, effectively prioritizing emotional memory consolidation. On a neural level, we found that cortisol reduced amygdala reactivity related to the retrieval of these same consolidated, negative items. These findings show that cortisol administration during first post-encoding sleep had a twofold effect on the first 24h of emotional memory consolidation. While cortisol prioritized recognition memory for emotional items, it reduced reactivation of the neural circuitry underlying emotional responsiveness during retrieval. These findings fit recent theories on emotional depotentiation following consolidation during sleep, although future research should establish the sleep-dependence of this effect. 
Moreover, our data may shed light on mechanisms underlying potential therapeutic effects of cortisol administration following psychological trauma. Copyright © 2013 Elsevier Ltd. All rights reserved.
Immediate memory consequences of the effect of emotion on attention to pictures.
Talmi, Deborah; Anderson, Adam K; Riggs, Lily; Caplan, Jeremy B; Moscovitch, Morris
2008-03-01
Emotionally arousing stimuli are at once both highly attention grabbing and memorable. We examined whether emotional enhancement of memory (EEM) reflects an indirect effect of emotion on memory, mediated by enhanced attention to emotional items during encoding. We tested a critical prediction of the mediation hypothesis-that regions conjointly activated by emotion and attention would correlate with subsequent EEM. Participants were scanned with fMRI while they watched emotional or neutral pictures under instructions to attend to them a lot or a little, and were then given an immediate recognition test. A region in the left fusiform gyrus was activated by emotion, voluntary attention, and subsequent EEM. A functional network, different for each attention condition, connected this region and the amygdala, which was associated with emotion and EEM, but not with voluntary attention. These findings support an indirect cortical mediation account of immediate EEM that may complement a direct modulation model.
Emotion processing for arousal and neutral content in Alzheimer's disease.
Satler, Corina; Uribe, Carlos; Conde, Carlos; Da-Silva, Sergio Leme; Tomaz, Carlos
2010-02-01
Objective. To assess the ability of Alzheimer's disease (AD) patients to perceive emotional information and to assign subjective emotional rating scores to audiovisual presentations. Materials and Methods. 24 subjects (14 with AD, and controls matched for age and educational level) were studied. After neuropsychological assessment, they watched a neutral story and then a story with emotional content. Results. Recall scores for both stories were significantly lower in AD (Neutral and Emotional: P = .001). The control group (CG) assigned different emotional scores to each version of the test, P = .001, while ratings by AD patients did not differ, P = .32. Linear regression analyses determined the best predictors of emotional rating and recognition memory for each group from the neuropsychological test battery. Conclusions. AD patients show changes in emotional processing of declarative memory and a preserved ability to express emotions in the face of arousing content. The present findings suggest that these impairments are due to general cognitive decline.
Feasibility Testing of a Wearable Behavioral Aid for Social Learning in Children with Autism.
Daniels, Jena; Haber, Nick; Voss, Catalin; Schwartz, Jessey; Tamura, Serena; Fazel, Azar; Kline, Aaron; Washington, Peter; Phillips, Jennifer; Winograd, Terry; Feinstein, Carl; Wall, Dennis P
2018-01-01
Recent advances in computer vision and wearable technology have created an opportunity to introduce mobile therapy systems for autism spectrum disorders (ASD) that can respond to the increasing demand for therapeutic interventions; however, feasibility questions must be answered first. We studied the feasibility of a prototype therapeutic tool for children with ASD using Google Glass, examining whether children with ASD would wear such a device, whether providing the emotion classification would improve emotion recognition, and how emotion recognition differs between ASD participants and neurotypical controls (NC). We ran a controlled laboratory experiment with 43 children: 23 with ASD and 20 NC. Children identified static facial images on a computer screen with one of 7 emotions in 3 successive batches: the first with no information about emotion provided to the child, the second with the correct classification from the Glass labeling the emotion, and the third again without emotion information. We then trained a logistic regression classifier on the emotion confusion matrices generated by the two information-free batches to predict ASD versus NC. All 43 children were comfortable wearing the Glass. ASD and NC participants who completed the computer task with Glass providing audible emotion labeling (n = 33) showed increased accuracy in emotion labeling, and the logistic regression classifier achieved an accuracy of 72.7%. Further analysis suggests that the ability to recognize surprise, fear, and neutrality may distinguish ASD cases from NC. This feasibility study supports the utility of a wearable device for social affective learning in children with ASD and demonstrates subtle differences in how ASD and NC children perform on an emotion recognition task.
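The classification step described above (a logistic regression fit to flattened emotion confusion matrices) can be sketched as follows. This is a minimal illustration on synthetic stand-in data, with a plain gradient-descent fit; the study's actual matrices, feature handling, and cross-validation are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain batch gradient-descent logistic regression (weights + bias)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted P(class = 1)
        w -= lr * X.T @ (p - y) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

# Synthetic stand-in data: each row flattens a hypothetical 7x7 emotion
# confusion matrix (rows = shown emotion, cols = labelled emotion).
n_children, n_emotions = 40, 7
X = rng.random((n_children, n_emotions * n_emotions))
y = (rng.random(n_children) > 0.5).astype(float)  # 1 = ASD, 0 = NC (random labels)

w, b = fit_logistic(X, y)
probs = 1.0 / (1.0 + np.exp(-(X @ w + b)))
train_accuracy = float(np.mean((probs > 0.5) == (y == 1.0)))
```

With real data one would report held-out (not training) accuracy, as the 72.7% figure above presumably was.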
NASA Astrophysics Data System (ADS)
Harit, Aditya; Joshi, J. C., Col; Gupta, K. K.
2018-03-01
This paper proposes an automatic facial emotion recognition algorithm comprising two main components: feature extraction and expression recognition. The algorithm applies a Gabor filter bank at fiducial points to extract facial expression features. The resulting magnitudes of the Gabor transforms, along with 14 chosen FAPs (Facial Animation Parameters), compose the feature space. There are two stages: a training phase and a recognition phase. In the training stage, the system classifies all training expressions of the 6 emotions considered into 6 classes, one per emotion. In the recognition phase, it locates the fiducial points in a face image, applies the Gabor bank to them, and feeds the resulting features to the trained neural architecture to recognize the emotion.
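The feature-extraction step (Gabor response magnitudes sampled at fiducial points) can be sketched roughly as below. The kernel size, scales, and orientation count here are illustrative assumptions; the abstract does not specify the paper's exact filter-bank parameters.

```python
import numpy as np

def gabor_kernel(ksize, sigma, theta, lam):
    """Complex Gabor kernel: Gaussian envelope times a complex sinusoid."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + yr ** 2) / (2.0 * sigma ** 2))
    return envelope * np.exp(1j * 2.0 * np.pi * xr / lam)

def gabor_features(image, points, n_thetas=4, sigmas=(4.0, 8.0), ksize=15):
    """Gabor response magnitudes sampled at each fiducial point (r, c)."""
    half = ksize // 2
    padded = np.pad(image.astype(float), half, mode="edge")
    feats = []
    for sigma in sigmas:
        for k in range(n_thetas):
            kern = gabor_kernel(ksize, sigma, np.pi * k / n_thetas, lam=1.5 * sigma)
            for r, c in points:  # patch centred on the fiducial point
                patch = padded[r:r + ksize, c:c + ksize]
                feats.append(abs(np.sum(patch * kern)))
    return np.array(feats)

# Demo: 2 scales x 4 orientations x 2 fiducial points = 16 features.
demo = gabor_features(np.random.default_rng(1).random((32, 32)),
                      [(10, 10), (20, 15)])
```

In the full pipeline these magnitudes would be concatenated with the 14 FAPs before classification.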
Response Bias in "Remembering" Emotional Stimuli: A New Perspective on Age Differences
ERIC Educational Resources Information Center
Kapucu, Aycan; Rotello, Caren M.; Ready, Rebecca E.; Seidl, Katharina N.
2008-01-01
Older adults sometimes show a recall advantage for emotionally positive, rather than neutral or negative, stimuli (S. T. Charles, M. Mather, & L. L. Carstensen, 2003). In contrast, younger adults respond "old" and "remember" more often to negative materials in recognition tests. For younger adults, both effects are due to…
ERIC Educational Resources Information Center
Treese, Anne-Cecile; Johansson, Mikael; Lindgren, Magnus
2010-01-01
The emotional salience of faces has previously been shown to induce memory distortions in recognition memory tasks. This event-related potential (ERP) study used repeated runs of a continuous recognition task with emotional and neutral faces to investigate emotion-induced memory distortions. In the second and third runs, participants made more…
Seymour, Karen E; Jones, Richard N; Cushman, Grace K; Galvan, Thania; Puzia, Megan E; Kim, Kerri L; Spirito, Anthony; Dickstein, Daniel P
2016-03-01
Little is known about the bio-behavioral mechanisms underlying and differentiating suicide attempts from non-suicidal self-injury (NSSI) in adolescents. Adolescents who attempt suicide or engage in NSSI often report significant interpersonal and social difficulties. Emotional face recognition ability is a fundamental skill required for successful social interactions, and deficits in this ability may provide insight into the unique brain-behavior interactions underlying suicide attempts versus NSSI in adolescents. Therefore, we examined emotional face recognition ability among three mutually exclusive groups: (1) inpatient adolescents who attempted suicide (SA, n = 30); (2) inpatient adolescents engaged in NSSI (NSSI, n = 30); and (3) typically developing controls (TDC, n = 30) without psychiatric illness. Participants included adolescents aged 13-17 years, matched on age, gender and full-scale IQ. Emotional face recognition was evaluated using the diagnostic assessment of nonverbal accuracy (DANVA-2). Compared to TDC youth, adolescents with NSSI made more errors on child fearful and adult sad face recognition while controlling for psychopathology and medication status (ps < 0.05). No differences were found on emotional face recognition between NSSI and SA groups. Secondary analyses showed that compared to inpatients without major depression, those with major depression made fewer errors on adult sad face recognition even when controlling for group status (p < 0.05). Further, compared to inpatients without generalized anxiety, those with generalized anxiety made fewer recognition errors on adult happy faces even when controlling for group status (p < 0.05). Adolescent inpatients engaged in NSSI showed greater deficits in emotional face recognition than TDC, but not inpatient adolescents who attempted suicide. Further results suggest the importance of psychopathology in emotional face recognition. 
Replication of these preliminary results and examination of the role of context-dependent emotional processing are needed moving forward.
Alfimova, M V; Golimbet, V E; Korovaitseva, G I; Lezheiko, T V; Abramova, L I; Aksenova, E V; Bolgov, M I
2014-01-01
The 5-HTTLPR SLC6A4 and catechol-O-methyltransferase (COMT) Val158Met polymorphisms are reported to be associated with the processing of facial expressions in the general population. Impaired recognition of facial expressions, which is characteristic of schizophrenia, negatively impacts the social adaptation of patients. To search for molecular mechanisms of this deficit, we studied the main and epistatic effects of the 5-HTTLPR and Val158Met polymorphisms on facial emotion recognition in patients with schizophrenia (n=299) and healthy controls (n=232). The 5-HTTLPR polymorphism was associated with emotion recognition in patients: ll-homozygotes recognized facial emotions significantly better than s-allele carriers (F=8.00; p=0.005). Although recognition of facial emotions correlated with negative symptoms, verbal learning, and trait anxiety, these variables did not significantly modify the association. In both groups, no effect of COMT on the recognition of facial emotions was found.
Sullivan, Susan; Campbell, Anna; Hutton, Sam B; Ruffman, Ted
2017-05-01
Research indicates that older adults' (≥60 years) emotion recognition is worse than that of young adults, that young and older men's emotion recognition is worse than that of young and older women (respectively), and that older adults look at mouths, relative to eyes, more than young adults do. Nevertheless, previous research has not compared older men's and women's looking at emotion faces, so the present study had two aims: (a) to examine whether the tendency to look at mouths is stronger among older men than older women and (b) to examine whether men's mouth looking correlates with better emotion recognition. We examined the emotion recognition abilities and spontaneous gaze patterns of young (n = 60) and older (n = 58) males and females as they labelled emotion faces. Older men spontaneously looked more at mouths than older women did, and older men's looking at mouths correlated with their emotion recognition, whereas women's looking at eyes correlated with theirs. The findings are discussed in relation to a growing body of research suggesting both age and gender differences in response to emotional stimuli and the differential efficacy of mouth and eye looking for men and women.
Impaired recognition of scary music following unilateral temporal lobe excision.
Gosselin, Nathalie; Peretz, Isabelle; Noulhiane, Marion; Hasboun, Dominique; Beckett, Christine; Baulac, Michel; Samson, Séverine
2005-03-01
Music constitutes an ideal means to create a sense of suspense in films. However, there has been minimal investigation into the underlying cerebral organization for perceiving danger created by music. In comparison, the amygdala's role in recognition of fear in non-musical contexts has been well established. The present study sought to fill this gap in exploring how patients with amygdala resection recognize emotional expression in music. To this aim, we tested 16 patients with left (LTR; n = 8) or right (RTR; n = 8) medial temporal resection (including amygdala) for the relief of medically intractable seizures and 16 matched controls in an emotion recognition task involving instrumental music. The musical selections were purposely created to induce fear, peacefulness, happiness and sadness. Participants were asked to rate to what extent each musical passage expressed these four emotions on 10-point scales. In order to check for the presence of a perceptual problem, the same musical selections were presented to the participants in an error detection task. None of the patients was found to perform below controls in the perceptual task. In contrast, both LTR and RTR patients were found to be impaired in the recognition of scary music. Recognition of happy and sad music was normal. These findings suggest that the anteromedial temporal lobe (including the amygdala) plays a role in the recognition of danger in a musical context.
Fajrianthi; Zein, Rizqy Amelia
2017-01-01
This study aimed to develop an emotional intelligence (EI) test suited to the Indonesian workplace context. The Airlangga Emotional Intelligence Test (Tes Kecerdasan Emosi Airlangga [TKEA]) was designed to measure three EI domains: 1) emotional appraisal, 2) emotional recognition, and 3) emotional regulation. TKEA consists of 120 items, 40 for each subset, and was developed using the Situational Judgment Test (SJT) approach. To ensure its psychometric quality, categorical confirmatory factor analysis (CCFA) and item response theory (IRT) were applied to test its validity and reliability. The study was conducted on 752 participants; the test information function (TIF) was 3.414 for subset 1 (at ability level = 0), 12.183 for subset 2 (at ability level = -2), and 2.398 for subset 3 (at ability level = -2). It is concluded that TKEA performs best at measuring individuals with a low level of EI ability. It is worth noting that TKEA is still at the development stage; therefore, in this study we investigated item analysis and the dimensionality of each TKEA subset.
Altering sensorimotor feedback disrupts visual discrimination of facial expressions.
Wood, Adrienne; Lupyan, Gary; Sherrin, Steven; Niedenthal, Paula
2016-08-01
Looking at another person's facial expression of emotion can trigger the same neural processes involved in producing the expression, and such responses play a functional role in emotion recognition. Disrupting individuals' facial action, for example, interferes with verbal emotion recognition tasks. We tested the hypothesis that facial responses also play a functional role in the perceptual processing of emotional expressions. We altered the facial action of participants with a gel facemask while they performed a task that involved distinguishing target expressions from highly similar distractors. Relative to control participants, participants in the facemask condition demonstrated inferior perceptual discrimination of facial expressions, but not of nonface stimuli. The findings suggest that somatosensory/motor processes involving the face contribute to the visual perceptual-and not just conceptual-processing of facial expressions. More broadly, our study contributes to growing evidence for the fundamentally interactive nature of the perceptual inputs from different sensory modalities.
McCade, Donna; Savage, Greg; Guastella, Adam; Hickie, Ian B; Lewis, Simon J G; Naismith, Sharon L
2013-09-01
Impaired emotion recognition in dementia is associated with increased patient agitation, behavior management difficulties, and caregiver burden. Emerging evidence supports the presence of very early emotion recognition difficulties in mild cognitive impairment (MCI); however, the relationship between these impairments and psychosocial measures has not yet been explored. The emotion recognition abilities of 27 patients with nonamnestic MCI (naMCI), 29 patients with amnestic MCI (aMCI), and 22 control participants were assessed. Self-report measures assessed patient functional disability, while informants rated the degree of burden they experienced. Difficulty recognizing anger was evident in the amnestic subtype. Although both patient groups reported greater social functioning disability compared with controls, a relationship between social dysfunction and anger recognition was evident only for patients with naMCI. A significant association was found between burden and anger recognition in patients with aMCI. Impaired emotion recognition abilities impact the MCI subtypes differentially. Interventions targeted at patients with MCI and their caregivers are warranted.
Kessels, Roy P C; Montagne, Barbara; Hendriks, Angelique W; Perrett, David I; de Haan, Edward H F
2014-03-01
The ability to recognize and label emotional facial expressions is an important aspect of social cognition. However, existing paradigms to examine this ability present only static facial expressions, suffer from ceiling effects or have limited or no norms. A computerized test, the Emotion Recognition Task (ERT), was developed to overcome these difficulties. In this study, we examined the effects of age, sex, and intellectual ability on emotion perception using the ERT. In this test, emotional facial expressions are presented as morphs gradually expressing one of the six basic emotions from neutral to four levels of intensity (40%, 60%, 80%, and 100%). The task was administered to 373 healthy participants aged 8-75. In children aged 8-17, only small developmental effects were found for the emotions anger and happiness, in contrast to adults, who showed age-related decline on anger, fear, happiness, and sadness. Sex differences were present predominantly in the adult participants. IQ only minimally affected the perception of disgust in the children, while years of education were correlated with all emotions but surprise and disgust in the adult participants. A regression-based approach was adopted to present age- and education- or IQ-adjusted normative data for use in clinical practice. Previous studies using the ERT have demonstrated selective impairments on specific emotions in a variety of psychiatric, neurologic, or neurodegenerative patient groups, making the ERT a valuable addition to existing paradigms for the assessment of emotion perception.
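A regression-based normative approach of the kind mentioned above typically converts a raw score into a demographically adjusted z-score: the raw score minus the regression-predicted score, divided by the residual standard deviation. A minimal sketch, with purely illustrative coefficients (not the published ERT norms):

```python
def adjusted_z(raw, age, education, norms):
    """Convert a raw test score to a demographically adjusted z-score.
    norms = (intercept, b_age, b_edu, residual_sd), estimated by
    regressing raw scores on age and education in the normative sample."""
    intercept, b_age, b_edu, residual_sd = norms
    predicted = intercept + b_age * age + b_edu * education
    return (raw - predicted) / residual_sd

# Illustrative coefficients only (hypothetical, for demonstration):
demo_norms = (50.0, -0.2, 0.5, 5.0)
z = adjusted_z(49.0, age=60.0, education=12.0, norms=demo_norms)  # -> 1.0
```

A clinician would then flag scores below some cut-off (e.g. z < -1.5) as impaired relative to demographically matched peers.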
ERIC Educational Resources Information Center
Golan, Ofer; Baron-Cohen, Simon; Golan, Yael
2008-01-01
Children with autism spectrum conditions (ASC) have difficulties recognizing others' emotions. Research has mostly focused on "basic" emotion recognition, devoid of context. This study reports the results of a new task, assessing recognition of "complex" emotions and mental states in social contexts. An ASC group (n = 23) was compared to a general…
Mancuso, Mauro; Magnani, Nadia; Cantagallo, Anna; Rossi, Giulia; Capitani, Donatella; Galletti, Vania; Cardamone, Giuseppe; Robertson, Ian Hamilton
2015-02-01
The aim of our study was to identify the common and separate mechanisms that might underpin emotion recognition impairment in patients with traumatic brain injury (TBI) and schizophrenia (Sz) compared with healthy controls (HCs). We recruited 21 Sz outpatients, 24 severe TBI outpatients, and 38 HCs, and we used eye-tracking to compare facial emotion processing performance. Both Sz and TBI patients were significantly poorer at recognizing facial emotions than HCs. Sz patients explored the Pictures of Facial Affect stimuli differently and were significantly worse at recognizing neutral expressions. Selective or sustained attention deficits in TBI may reduce efficient emotion recognition, whereas in Sz a more strategic deficit underlies the observed problem. These findings suggest scope for tailoring effective rehabilitative training focused on emotion recognition.
The recognition of facial emotion expressions in Parkinson's disease.
Assogna, Francesca; Pontieri, Francesco E; Caltagirone, Carlo; Spalletta, Gianfranco
2008-11-01
A limited number of studies in Parkinson's disease (PD) suggest a disturbance in the recognition of facial emotion expressions. In particular, impaired disgust recognition has been reported in both unmedicated and medicated PD patients. However, the results remain inconclusive as to the degree and selectivity of the emotion recognition impairment, and an associated impairment of almost all basic facial emotions in PD has also been described. Few studies have investigated the relationship with neuropsychiatric and neuropsychological symptoms, with mainly negative results. This inconsistency may be due to many different problems, such as emotion assessment, perceptual deficits, cognitive impairment, behavioral symptoms, illness severity, and antiparkinsonian therapy. Here we review the clinical characteristics and neural structures involved in the recognition of specific facial emotion expressions, and the plausible role of dopamine transmission and dopamine replacement therapy in these processes. It is clear that future studies should be directed at clarifying all these issues.
2013-01-01
Background Relative to intentional memory encoding, which quickly declines in Mild Cognitive Impairment (MCI) and Alzheimer’s disease (AD), incidental memory for emotional stimuli appears to deteriorate more slowly. We hypothesised that tests of incidental emotional memory may inform on different aspects of cognitive decline in MCI and AD. Methods Patients with MCI, AD and Healthy Controls (HC) were asked to attend to emotional pictures (i.e., positive and neutral) sequentially presented during an fMRI session. Attention was monitored behaviourally. A surprise post-scan recognition test was then administered. Results The groups remained attentive within the scanner. The post-scan recognition pattern was in the form of (HC = MCI) > AD, with only the former group showing a clear benefit from emotional pictures. fMRI analysis of incidental encoding demonstrated clusters of activation in para-hippocampal regions and in the hippocampus in HC and MCI patients but not in AD patients. The pattern of activation observed in MCI patients tended to be greater than that found in HC. Conclusions The results suggest that incidental emotional memory might offer a suitable platform to investigate, using behavioural and fMRI measures, subtle changes in the process of developing AD. These changes seem to differ from those found using standard episodic memory tests. The underpinnings of such differences and the potential clinical use of this methodology are discussed in depth. PMID:23497150
Rigon, Arianna; Turkstra, Lyn S; Mutlu, Bilge; Duff, Melissa C
2018-05-01
To examine the relationship between facial-affect recognition and different aspects of self- and proxy-reported social-communication impairment following moderate-severe traumatic brain injury (TBI). Forty-six adults with chronic TBI (>6 months postinjury) and 42 healthy comparison (HC) adults were administered the La Trobe Communication Questionnaire (LCQ) Self and Other forms, to assess different aspects of communication competence, and the Emotion Recognition Test (ERT), to measure their ability to recognize facial affect. Individuals with TBI underperformed HC adults on the ERT, and both they and their close others reported more communication problems than were reported for HC adults. TBI group ERT scores were significantly and negatively correlated with LCQ-Other (but not LCQ-Self) scores (i.e., participants with lower emotion-recognition scores were rated by close others as having more communication problems). Multivariate regression analysis revealed that adults with higher ERT scores self-reported more problems with disinhibition-impulsivity and partner sensitivity and had fewer other-reported problems with disinhibition-impulsivity and conversational effectiveness. Our findings support growing evidence that emotion-recognition deficits play a role in specific aspects of social-communication outcomes after TBI and should be considered in treatment planning.
Hoffmann, Holger; Kessler, Henrik; Eppel, Tobias; Rukavina, Stefanie; Traue, Harald C
2010-11-01
Two experiments were conducted in order to investigate the effect of expression intensity on gender differences in the recognition of facial emotions. The first experiment compared recognition accuracy between female and male participants when emotional faces were shown with full-blown (100% emotional content) or subtle expressiveness (50%). In a second experiment more finely grained analyses were applied in order to measure recognition accuracy as a function of expression intensity (40%-100%). The results show that although women were more accurate than men in recognizing subtle facial displays of emotion, there was no difference between male and female participants when recognizing highly expressive stimuli. Copyright © 2010 Elsevier B.V. All rights reserved.
Rodrigo-Ruiz, D; Perez-Gonzalez, J C; Cejudo, J
2017-08-16
It has recently been noted that children with attention deficit hyperactivity disorder (ADHD) show deficits in emotional competence and emotional intelligence, specifically in their ability to recognize emotions. We present a systematic review of the scientific literature on the recognition of emotional facial expressions in children with ADHD, in order to establish or rule out the existence of emotional deficits as a primary dysfunction of this disorder and, where appropriate, to estimate the effect size of the differences relative to typically developing (neurotypical) children. The results reveal recent interest in the issue and a lack of information. Although there is no complete agreement, most studies show that recognition of emotional facial expressions is affected in children with ADHD, who are significantly less accurate than control-group children at recognizing emotions communicated through facial expressions. Some of these studies compare the recognition of different discrete emotions; children with ADHD tend to have greater difficulty recognizing negative emotions, especially anger, fear, and disgust. These results have direct implications for the educational and clinical diagnosis of ADHD and for educational intervention: for children with ADHD, emotional education might be an advantageous aid.
Influences on Facial Emotion Recognition in Deaf Children
ERIC Educational Resources Information Center
Sidera, Francesc; Amadó, Anna; Martínez, Laura
2017-01-01
This exploratory research is aimed at studying facial emotion recognition abilities in deaf children and how they relate to linguistic skills and the characteristics of deafness. A total of 166 participants (75 deaf) aged 3-8 years were administered the following tasks: facial emotion recognition, naming vocabulary and cognitive ability. The…
The Differential Effects of Thalamus and Basal Ganglia on Facial Emotion Recognition
ERIC Educational Resources Information Center
Cheung, Crystal C. Y.; Lee, Tatia M. C.; Yip, James T. H.; King, Kristin E.; Li, Leonard S. W.
2006-01-01
This study examined if subcortical stroke was associated with impaired facial emotion recognition. Furthermore, the lateralization of the impairment and the differential profiles of facial emotion recognition deficits with localized thalamic or basal ganglia damage were also studied. Thirty-eight patients with subcortical strokes and 19 matched…
Towards Real-Time Speech Emotion Recognition for Affective E-Learning
ERIC Educational Resources Information Center
Bahreini, Kiavash; Nadolski, Rob; Westera, Wim
2016-01-01
This paper presents the voice emotion recognition part of the FILTWAM framework for real-time emotion recognition in affective e-learning settings. FILTWAM (Framework for Improving Learning Through Webcams And Microphones) intends to offer timely and appropriate online feedback based upon learner's vocal intonations and facial expressions in order…
Emotion Recognition Abilities and Empathy of Victims of Bullying
ERIC Educational Resources Information Center
Woods, Sarah; Wolke, Dieter; Nowicki, Stephen; Hall, Lynne
2009-01-01
Objectives: Bullying is a form of systematic abuse by peers with often serious consequences for victims. Few studies have considered the role of emotion recognition abilities and empathic behaviour for different bullying roles. This study investigated physical and relational bullying involvement in relation to basic emotion recognition abilities,…
Facial recognition in education system
NASA Astrophysics Data System (ADS)
Krithika, L. B.; Venkatesh, K.; Rathore, S.; Kumar, M. Harish
2017-11-01
Human beings rely extensively on emotions to convey messages and resolve them. Emotion detection and face recognition can provide an interface between individuals and technologies, and face recognition is among the most successful applications of recognition analysis. Many techniques have been used to recognize facial expressions and to handle emotion detection under varying poses. In this paper, we present an efficient method for recognizing facial expressions by tracking facial points and the distances between them. The method automatically identifies the observer's face movements and facial expression in an image, capturing different aspects of emotion and facial expression.
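Tracking face points and using the distances between them, as described, usually amounts to building a feature vector of pairwise landmark distances. A small sketch, assuming landmarks are already available as (x, y) pixel coordinates; normalising by the inter-ocular distance is an illustrative choice (not stated in the paper) that makes the features invariant to face scale:

```python
import math

def distance_features(landmarks, eye_left, eye_right):
    """Pairwise distances between tracked face points, normalised by the
    inter-ocular distance so the features do not depend on face scale."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    iod = dist(eye_left, eye_right)
    feats = []
    for i in range(len(landmarks)):
        for j in range(i + 1, len(landmarks)):
            feats.append(dist(landmarks[i], landmarks[j]) / iod)
    return feats
```

Per-frame changes in these distances (e.g. mouth corners moving apart) are then what an expression classifier would consume.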
Zheng, Leilei; Chai, Hao; Chen, Wanzhen; Yu, Rongrong; He, Wei; Jiang, Zhengyan; Yu, Shaohua; Li, Huichun; Wang, Wei
2011-12-01
Early parental bonding experiences play a role in emotion recognition and expression in later adulthood, and patients with personality disorder frequently experience inappropriate parental bonding styles, therefore the aim of the present study was to explore whether parental bonding style is correlated with recognition of facial emotion in personality disorder patients. The Parental Bonding Instrument (PBI) and the Matsumoto and Ekman Japanese and Caucasian Facial Expressions of Emotion (JACFEE) photo set tests were carried out in 289 participants. Patients scored lower on parental Care but higher on parental Freedom Control and Autonomy Denial subscales, and they displayed less accuracy when recognizing contempt, disgust and happiness than the healthy volunteers. In healthy volunteers, maternal Autonomy Denial significantly predicted accuracy when recognizing fear, and maternal Care predicted the accuracy of recognizing sadness. In patients, paternal Care negatively predicted the accuracy of recognizing anger, paternal Freedom Control predicted the perceived intensity of contempt, maternal Care predicted the accuracy of recognizing sadness, and the intensity of disgust. Parental bonding styles have an impact on the decoding process and sensitivity when recognizing facial emotions, especially in personality disorder patients.
Tanaka, James W; Wolf, Julie M; Klaiman, Cheryl; Koenig, Kathleen; Cockburn, Jeffrey; Herlihy, Lauren; Brown, Carla; Stahl, Sherin S; South, Mikle; McPartland, James C; Kaiser, Martha D; Schultz, Robert T
2012-12-01
Although impaired social-emotional ability is a hallmark of autism spectrum disorder (ASD), the perceptual skills and mediating strategies contributing to the social deficits of autism are not well understood. A perceptual skill that is fundamental to effective social communication is the ability to accurately perceive and interpret facial emotions. To evaluate the expression processing of participants with ASD, we designed the Let's Face It! Emotion Skills Battery (LFI! Battery), a computer-based assessment composed of three subscales measuring verbal and perceptual skills implicated in the recognition of facial emotions. We administered the LFI! Battery to groups of participants with ASD and typically developing control (TDC) participants that were matched for age and IQ. On the Name Game labeling task, participants with ASD (N = 68) performed on par with TDC individuals (N = 66) in their ability to name the facial emotions of happy, sad, disgust and surprise and were only impaired in their ability to identify the angry expression. On the Matchmaker Expression task that measures the recognition of facial emotions across different facial identities, the ASD participants (N = 66) performed reliably worse than TDC participants (N = 67) on the emotions of happy, sad, disgust, frighten and angry. In the Parts-Wholes test of perceptual strategies of expression, the TDC participants (N = 67) displayed more holistic encoding for the eyes than the mouths in expressive faces whereas ASD participants (N = 66) exhibited the reverse pattern of holistic recognition for the mouth and analytic recognition of the eyes. In summary, findings from the LFI! Battery show that participants with ASD were able to label the basic facial emotions (with the exception of angry expression) on par with age- and IQ-matched TDC participants. 
However, participants with ASD were impaired in their ability to generalize facial emotions across different identities and showed a tendency to recognize the mouth feature holistically and the eyes as isolated parts.
Uskul, Ayse K; Paulmann, Silke; Weick, Mario
2016-02-01
Listeners have to pay close attention to a speaker's tone of voice (prosody) during daily conversations. This is particularly important when trying to infer the emotional state of the speaker. Although a growing body of research has explored how emotions are processed from speech in general, little is known about how psychosocial factors such as social power can shape the perception of vocal emotional attributes. Thus, the present studies explored how social power affects emotional prosody recognition. In a correlational study (Study 1) and an experimental study (Study 2), we show that high power is associated with lower accuracy in emotional prosody recognition than low power. These results, for the first time, suggest that individuals experiencing high or low power perceive emotional tone of voice differently.
Hindocha, Chandni; Freeman, Tom P; Schafer, Grainne; Gardener, Chelsea; Das, Ravi K; Morgan, Celia J A; Curran, H Valerie
2015-03-01
Acute administration of the primary psychoactive constituent of cannabis, Δ-9-tetrahydrocannabinol (THC), impairs human facial affect recognition, implicating the endocannabinoid system in emotional processing. Another main constituent of cannabis, cannabidiol (CBD), has seemingly opposite functional effects on the brain. This study aimed to determine the effects of THC and CBD, both alone and in combination, on emotional facial affect recognition. 48 volunteers, selected for high and low frequency of cannabis use and schizotypy, were administered THC (8 mg), CBD (16 mg), THC+CBD (8 mg+16 mg) and placebo, by inhalation, in a 4-way, double-blind, placebo-controlled crossover design. They completed an emotional facial affect recognition task including fearful, angry, happy, sad, surprise and disgust faces varying in intensity from 20% to 100%. A visual analogue scale (VAS) of feeling 'stoned' was also completed. In comparison to placebo, CBD improved emotional facial affect recognition at 60% emotional intensity; THC was detrimental to the recognition of ambiguous faces of 40% intensity. The combination of THC+CBD produced no impairment. Relative to placebo, both THC alone and combined THC+CBD equally increased feelings of being 'stoned'. CBD did not influence feelings of being 'stoned'. No effects of frequency of use or schizotypy were found. In conclusion, CBD improves recognition of emotional facial affect and attenuates the impairment induced by THC. This is the first human study examining the effects of different cannabinoids on emotional processing. It provides preliminary evidence that different pharmacological agents acting upon the endocannabinoid system can both improve and impair recognition of emotional faces. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
Emotion Recognition in Preschool Children: Associations with Maternal Depression and Early Parenting
Kujawa, Autumn; Dougherty, Lea; Durbin, C. Emily; Laptook, Rebecca; Torpey, Dana; Klein, Daniel N.
2013-01-01
Emotion knowledge in childhood has been shown to predict social functioning and psychological well-being, but relatively little is known about parental factors that influence its development in early childhood. There is some evidence that both parenting behavior and maternal depression are associated with emotion recognition, but previous research has only examined these factors independently. The current study assessed auditory and visual emotion recognition ability among a large sample of preschool children to examine typical emotion recognition skills in children of this age, as well as the independent and interactive effects of maternal and paternal depression and negative parenting (i.e., hostility and intrusiveness). Results indicated that children were most accurate at identifying happy emotional expressions, followed by other basic emotions. The lowest accuracy was observed for neutral expressions. A significant interaction was found between maternal depression and negative parenting behavior, such that children with a maternal history of depression were particularly sensitive to the negative effects of maladaptive parenting behavior on emotion recognition ability. No significant effects were found for paternal depression. These results highlight the importance of examining the effects of multiple interacting factors on children’s emotional development, and provide suggestions for identifying children for targeted preventive interventions. PMID:24444174
Herba, Catherine; Phillips, Mary
2004-10-01
Intact emotion processing is critical for normal emotional development. Recent advances in neuroimaging have facilitated the examination of brain development, and have allowed for the exploration of the relationships between the development of emotion processing abilities, and that of associated neural systems. A literature review was performed of published studies examining the development of emotion expression recognition in normal children and psychiatric populations, and of the development of neural systems important for emotion processing. Few studies have explored the development of emotion expression recognition throughout childhood and adolescence. Behavioural studies suggest continued development throughout childhood and adolescence (reflected by accuracy scores and speed of processing), which varies according to the category of emotion displayed. Factors such as sex, socio-economic status, and verbal ability may also affect this development. Functional neuroimaging studies in adults highlight the role of the amygdala in emotion processing. Results of the few neuroimaging studies in children have focused on the role of the amygdala in the recognition of fearful expressions. Although results are inconsistent, they provide evidence for continued development of, and sex differences in, amygdalar function in response to fearful expressions throughout childhood and adolescence. Studies exploring emotion expression recognition in psychiatric populations of children and adolescents suggest deficits that are specific to the type of disorder and to the emotion displayed. Results from behavioural and neuroimaging studies indicate continued development of emotion expression recognition and neural regions important for this process throughout childhood and adolescence. Methodological inconsistencies and disparate findings make any conclusion difficult, however. 
Further studies are required examining the relationship between the development of emotion expression recognition and that of underlying neural systems, in particular subcortical and prefrontal cortical structures. These will inform understanding of the neural bases of normal and abnormal emotional development, and aid the development of earlier interventions for children and adolescents with psychiatric disorders.
Memory, emotion, and pupil diameter: Repetition of natural scenes.
Bradley, Margaret M; Lang, Peter J
2015-09-01
Recent studies have suggested that pupil diameter, like the "old-new" ERP, may be a measure of memory. Because the amplitude of the old-new ERP is enhanced for items encoded in the context of repetitions that are distributed (spaced), compared to massed (contiguous), we investigated whether pupil diameter is similarly sensitive to repetition. Emotional and neutral pictures of natural scenes were viewed once or repeated with massed (contiguous) or distributed (spaced) repetition during incidental free viewing and then tested on an explicit recognition test. Although an old-new difference in pupil diameter was found during successful recognition, pupil diameter was not enhanced for distributed, compared to massed, repetitions during either recognition or initial free viewing. Moreover, whereas a significant old-new difference was found for erotic scenes that had been seen only once during encoding, this difference was absent when erotic scenes were repeated. Taken together, the data suggest that pupil diameter is not a straightforward index of prior occurrence for natural scenes. © 2015 Society for Psychophysiological Research.
ERIC Educational Resources Information Center
Montirosso, Rosario; Peverelli, Milena; Frigerio, Elisa; Crespi, Monica; Borgatti, Renato
2010-01-01
The primary purpose of this study was to examine the effect of the intensity of emotion expression on children's developing ability to label emotion during a dynamic presentation of five facial expressions (anger, disgust, fear, happiness, and sadness). A computerized task (AFFECT--animated full facial expression comprehension test) was used to…
Li, Shijia; Weerda, Riklef; Milde, Christopher; Wolf, Oliver T; Thiel, Christiane M
2015-02-01
Noradrenaline interacts with stress hormones in the amygdala and hippocampus to enhance emotional memory consolidation, but the noradrenergic-glucocorticoid interaction at retrieval, where stress impairs memory, is less understood. We used a genetic neuroimaging approach to investigate whether a genetic variation of the noradrenergic system impacts stress-induced neural activity in the amygdala and hippocampus during recognition of emotional memory. This study is based on a genotype-dependent reanalysis of data from our previous publication (Li et al. Brain Imaging Behav 2014). Twenty-two healthy male volunteers were genotyped for the ADRA2B gene encoding the α2B-adrenergic receptor. Ten deletion carriers and 12 noncarriers performed an emotional face recognition task while their brain activity was measured with fMRI. During encoding, 50 fearful and 50 neutral faces were presented. One hour later, participants underwent either an acute stress procedure (Trier Social Stress Test) or a control procedure, followed immediately by the retrieval session, in which they had to discriminate between 100 old and 50 new faces. A genotype-dependent modulation of neural activity at retrieval was found in the bilateral amygdala and right hippocampus. Deletion carriers showed decreased neural activity in the amygdala when recognizing emotional faces in the control condition and increased amygdala activity under stress. Noncarriers showed no differences in emotion-modulated amygdala activation between stress and control conditions; instead, stress-induced increases during recognition of emotional faces were present in the right hippocampus. The genotype-dependent effects of acute stress on neural activity in the amygdala and hippocampus provide evidence for a noradrenergic-glucocorticoid interaction in emotional memory retrieval.
Recognition profile of emotions in natural and virtual faces.
Dyck, Miriam; Winbeck, Maren; Leiberg, Susanne; Chen, Yuhan; Gur, Ruben C; Mathiak, Klaus
2008-01-01
Computer-generated virtual faces have become increasingly realistic, including the simulation of emotional expressions. These faces can be used as well-controlled, realistic and dynamic stimuli in emotion research. However, the validity of virtual facial expressions in comparison to natural emotion displays still needs to be shown for the different emotions and different age groups. Thirty-two healthy volunteers between the ages of 20 and 60 rated pictures of natural human faces and faces of virtual characters (avatars) with respect to the expressed emotions: happiness, sadness, anger, fear, disgust, and neutral. Results indicate that virtual emotions were recognized comparably to natural ones. Recognition differences in virtual and natural faces depended on specific emotions: whereas disgust was difficult to convey with the current avatar technology, virtual sadness and fear achieved better recognition results than natural faces. Furthermore, emotion recognition rates decreased for virtual but not natural faces in participants over the age of 40. This specific age effect suggests that media exposure has an influence on emotion recognition. Virtual and natural facial displays of emotion may be equally effective. Improved technology (e.g. better modelling of the naso-labial area) may lead to even better results as compared to trained actors. Due to the ease with which virtual human faces can be animated and manipulated, validated artificial emotional expressions will be of major relevance in future research and therapeutic applications.
Embodied emotion impairment in Huntington's Disease.
Trinkler, Iris; Devignevielle, Sévérine; Achaibou, Amal; Ligneul, Romain V; Brugières, Pierre; Cleret de Langavant, Laurent; De Gelder, Beatrice; Scahill, Rachael; Schwartz, Sophie; Bachoud-Lévi, Anne-Catherine
2017-07-01
Theories of embodied cognition suggest that perceiving an emotion involves somatovisceral and motoric re-experiencing. Here we suggest taking such an embodied stance when looking at emotion processing deficits in patients with Huntington's Disease (HD), a neurodegenerative motor disorder. The literature on these patients' emotion recognition deficit has recently been enriched by some reports of impaired emotion expression. The goal of the study was to find out if expression deficits might be linked to a more motoric level of impairment. We used electromyography (EMG) to compare voluntary emotion expression from words to emotion imitation from static face images, and spontaneous emotion mimicry in 28 HD patients and 24 matched controls. For the latter two imitation conditions, an underlying emotion understanding is not imperative (even though performance might be helped by it). EMG measures were compared to emotion recognition and to the capacity to identify and describe emotions using alexithymia questionnaires. Alexithymia questionnaires tap into the more somato-visceral or interoceptive aspects of emotion perception. Furthermore, we correlated patients' expression and recognition scores to cerebral grey matter volume using voxel-based morphometry (VBM). EMG results replicated impaired voluntary emotion expression in HD. Critically, voluntary imitation and spontaneous mimicry were equally impaired and correlated with impaired recognition. By contrast, alexithymia scores were normal, suggesting that emotion representations on the level of internal experience might be spared. Recognition correlated with brain volume in the caudate as well as in areas previously associated with shared action representations, namely somatosensory, posterior parietal, posterior superior temporal sulcus (pSTS) and subcentral sulcus. Together, these findings indicate that in these patients emotion deficits might be tied to the "motoric level" of emotion expression. 
Such a double-sided recognition and expression impairment may have important consequences, interrupting empathy in nonverbal communication both ways (understanding and being understood), independently of intact internal experience of emotion. Copyright © 2017 Elsevier Ltd. All rights reserved.
Emotional Faces in Context: Age Differences in Recognition Accuracy and Scanning Patterns
Noh, Soo Rim; Isaacowitz, Derek M.
2014-01-01
While age-related declines in facial expression recognition are well documented, previous research relied mostly on isolated faces devoid of context. We investigated the effects of context on age differences in recognition of facial emotions and in visual scanning patterns of emotional faces. While their eye movements were monitored, younger and older participants viewed facial expressions (i.e., anger, disgust) in contexts that were emotionally congruent, incongruent, or neutral to the facial expression to be identified. Both age groups had highest recognition rates of facial expressions in the congruent context, followed by the neutral context, and recognition rates in the incongruent context were worst. These context effects were more pronounced for older adults. Compared to younger adults, older adults exhibited a greater benefit from congruent contextual information, regardless of facial expression. Context also influenced the pattern of visual scanning characteristics of emotional faces in a similar manner across age groups. In addition, older adults initially attended more to context overall. Our data highlight the importance of considering the role of context in understanding emotion recognition in adulthood. PMID:23163713
Preti, Emanuele; Richetin, Juliette; Suttora, Chiara; Pisani, Alberto
2016-04-30
Dysfunctions in social cognition characterize personality disorders. However, mixed results have emerged from the literature on emotion processing: Borderline Personality Disorder (BPD) traits are associated with enhanced emotion recognition, with impairments, or with functioning equal to controls. These apparent contradictions might result from the complexity of the emotion recognition tasks used and from individual differences in impulsivity and effortful control. We conducted a study in a sample of undergraduate students (n=80), assessing BPD traits, using an emotion recognition task that requires the processing of either visual information only or both visual and acoustic information. We also measured individual differences in impulsivity and effortful control. Results demonstrated the moderating role of some components of impulsivity and effortful control on the ability of BPD traits to predict anger and happiness recognition. We organized the discussion around the interaction between different components of regulatory functioning and task complexity for a better understanding of emotion recognition in BPD samples. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Domes, Gregor; Kumbier, Ekkehardt; Heinrichs, Markus; Herpertz, Sabine C
2014-01-01
The neuropeptide oxytocin has recently been shown to enhance eye gaze and emotion recognition in healthy men. Here, we report a randomized double-blind, placebo-controlled trial that examined the neural and behavioral effects of a single dose of intranasal oxytocin on emotion recognition in individuals with Asperger syndrome (AS), a clinical condition characterized by impaired eye gaze and facial emotion recognition. Using functional magnetic resonance imaging, we examined whether oxytocin would enhance emotion recognition from facial sections of the eye vs the mouth region and modulate regional activity in brain areas associated with face perception in both adults with AS, and a neurotypical control group. Intranasal administration of the neuropeptide oxytocin improved performance in a facial emotion recognition task in individuals with AS. This was linked to increased left amygdala reactivity in response to facial stimuli and increased activity in the neural network involved in social cognition. Our data suggest that the amygdala, together with functionally associated cortical areas mediate the positive effect of oxytocin on social cognitive functioning in AS. PMID:24067301
Speaker-sensitive emotion recognition via ranking: Studies on acted and spontaneous speech☆
Cao, Houwei; Verma, Ragini; Nenkova, Ani
2014-01-01
We introduce a ranking approach for emotion recognition which naturally incorporates information about the general expressivity of speakers. We demonstrate that our approach leads to substantial gains in accuracy compared to conventional approaches. We train ranking SVMs for individual emotions, treating the data from each speaker as a separate query, and combine the predictions from all rankers to perform multi-class prediction. The ranking method provides two natural benefits. It captures speaker specific information even in speaker-independent training/testing conditions. It also incorporates the intuition that each utterance can express a mix of possible emotions and that considering the degree to which each emotion is expressed can be productively exploited to identify the dominant emotion. We compare the performance of the rankers and their combination to standard SVM classification approaches on two publicly available datasets of acted emotional speech, Berlin and LDC, as well as on spontaneous emotional data from the FAU Aibo dataset. On acted data, ranking approaches exhibit significantly better performance compared to SVM classification both in distinguishing a specific emotion from all others and in multi-class prediction. On the spontaneous data, which contains mostly neutral utterances with a relatively small portion of less intense emotional utterances, ranking-based classifiers again achieve much higher precision in identifying emotional utterances than conventional SVM classifiers. In addition, we discuss the complementarity of conventional SVM and ranking-based classifiers. On all three datasets we find dramatically higher accuracy for the test items on whose prediction the two methods agree compared to the accuracy of individual methods. Furthermore on the spontaneous data the ranking and standard classification are complementary and we obtain marked improvement when we combine the two classifiers by late-stage fusion. PMID:25422534
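The per-emotion ranking scheme described above can be illustrated with the standard pairwise-transform trick, which turns a within-query ranking problem into binary classification solvable by an ordinary linear SVM. This is only an illustrative sketch, not the authors' implementation: the `EmotionRankers` and `pairwise_transform` names, the synthetic features, and the use of scikit-learn's `LinearSVC` are assumptions made for the example.

```python
import numpy as np
from sklearn.svm import LinearSVC

def pairwise_transform(X, relevance, groups):
    """Build pairwise difference vectors within each speaker 'query'.

    For every pair (i, j) of utterances from the same speaker where
    utterance i should rank above utterance j, emit X[i] - X[j] with
    label +1 and the mirrored pair with label -1.
    """
    Xp, yp = [], []
    for g in np.unique(groups):
        idx = np.where(groups == g)[0]
        for i in idx:
            for j in idx:
                if relevance[i] > relevance[j]:
                    Xp.append(X[i] - X[j]); yp.append(1)
                    Xp.append(X[j] - X[i]); yp.append(-1)
    return np.array(Xp), np.array(yp)

class EmotionRankers:
    """One ranker per emotion; multi-class prediction by argmax of scores."""

    def __init__(self, emotions):
        self.emotions = emotions
        self.models = {}

    def fit(self, X, labels, groups):
        for emo in self.emotions:
            # Within each speaker, utterances of this emotion should
            # rank above all other utterances by that speaker.
            rel = (labels == emo).astype(int)
            Xp, yp = pairwise_transform(X, rel, groups)
            self.models[emo] = LinearSVC(C=1.0).fit(Xp, yp)
        return self

    def predict(self, X):
        # Combine the per-emotion rankers: pick the emotion whose
        # ranker assigns the utterance the highest score.
        scores = np.column_stack(
            [self.models[e].decision_function(X) for e in self.emotions])
        return np.array(self.emotions)[scores.argmax(axis=1)]
```

On clearly separable synthetic features, each emotion's ranker scores utterances of its own emotion highest and the argmax over rankers recovers the label; in the actual studies, acoustic features (e.g. prosodic and spectral statistics) would take the place of the toy vectors.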
Intranasal oxytocin improves emotion recognition for youth with autism spectrum disorders.
Guastella, Adam J; Einfeld, Stewart L; Gray, Kylie M; Rinehart, Nicole J; Tonge, Bruce J; Lambert, Timothy J; Hickie, Ian B
2010-04-01
A diagnostic hallmark of autism spectrum disorders is a qualitative impairment in social communication and interaction. Deficits in the ability to recognize the emotions of others are believed to contribute to this. There is currently no effective treatment for these problems. In a double-blind, randomized, placebo-controlled, crossover design, we administered oxytocin nasal spray (18 or 24 IU) or a placebo to 16 male youth aged 12 to 19 who were diagnosed with Autistic or Asperger's Disorder. Participants then completed the Reading the Mind in the Eyes Task, a widely used and reliable test of emotion recognition. In comparison with placebo, oxytocin administration improved performance on the Reading the Mind in the Eyes Task. This effect was also shown when analysis was restricted to the younger participants aged 12 to 15 who received the lower dose. This study provides the first evidence that oxytocin nasal spray improves emotion recognition in young people diagnosed with autism spectrum disorders. Findings suggest the potential of earlier intervention and further evaluation of oxytocin nasal spray as a treatment to improve social communication and interaction in young people with autism spectrum disorders. Copyright 2010 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
Dmitrieva, E S; Gel'man, V Ia
2011-01-01
Listener-specific features of the recognition of different emotional intonations (positive, negative, and neutral) of male and female speakers, in the presence or absence of background noise, were studied in 49 adults aged 20-79 years. In all listeners, noise produced the most pronounced decrease in recognition accuracy for the positive emotional intonation ("joy") compared with other intonations, whereas it did not influence the recognition accuracy of "anger" in 65-79-year-old listeners. Higher recognition rates for noisy signals were observed for emotional intonations expressed by female speakers. Acoustic characteristics of noisy and clear speech signals underlying the perception of emotional prosody were identified for adult listeners of different ages and genders.
Associations between facial emotion recognition and young adolescents’ behaviors in bullying
Gini, Gianluca; Altoè, Gianmarco
2017-01-01
This study investigated whether different behaviors young adolescents can act during bullying episodes were associated with their ability to recognize morphed facial expressions of the six basic emotions, expressed at high and low intensity. The sample included 117 middle-school students (45.3% girls; mean age = 12.4 years) who filled in a peer nomination questionnaire and individually performed a computerized emotion recognition task. Bayesian generalized mixed-effects models showed a complex picture, in which type and intensity of emotions, students’ behavior and gender interacted in explaining recognition accuracy. Results were discussed with a particular focus on negative emotions and suggesting a “neutral” nature of emotion recognition ability, which does not necessarily lead to moral behavior but can also be used for pursuing immoral goals. PMID:29131871
Aviezer, Hillel; Hassin, Ran. R.; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo
2012-01-01
The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normal sighted individuals extract information about expressed emotions from face regions with activity diagnostic for specific emotion categories. Given LG’s impairment, we sought to shed light on his emotion perception by examining if priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full-face and then categorized the face’s emotion. Critically, the matched components were from regions which were diagnostic or non-diagnostic of the emotion portrayed by the full face. In experiment 1, when the full faces were briefly presented (150 ms), LG’s performance was strongly influenced by the diagnosticity of the components: His emotion recognition was boosted within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in experiment 2, when the face-exposure duration was extended (2000 ms), the beneficial effect of the diagnostic matching was diminished as was the detrimental effect of the non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components. PMID:22349446
Valence and the development of immediate and long-term false memory illusions.
Howe, Mark L; Candel, Ingrid; Otgaar, Henry; Malone, Catherine; Wimmer, Marina C
2010-01-01
Across five experiments we examined the role of valence in children's and adults' true and false memories. Using the Deese/Roediger-McDermott paradigm and either neutral or negative-emotional lists, both adults' (Experiment 1) and children's (Experiment 2) true recall and recognition were better for neutral than negative items, and although false recall was also higher for neutral items, false recognition was higher for negative items. The last three experiments examined adults' (Experiment 3) and children's (Experiments 4 and 5) 1-week long-term recognition of neutral and negative-emotional information. The results replicated the immediate recall and recognition findings from the first two experiments. More important, these experiments showed that although true recognition decreased over the 1-week interval, false recognition of neutral items remained unchanged, whereas false recognition of negative-emotional items increased. These findings are discussed in terms of theories of emotion and memory as well as their forensic implications.
Ventura, Joseph; Wood, Rachel C.; Jimenez, Amy M.; Hellemann, Gerhard S.
2014-01-01
Background In schizophrenia patients, one of the most commonly studied deficits of social cognition is emotion processing (EP), which has documented links to facial recognition (FR). But, how are deficits in facial recognition linked to emotion processing deficits? Can neurocognitive and symptom correlates of FR and EP help differentiate the unique contribution of FR to the domain of social cognition? Methods A meta-analysis of 102 studies (combined n = 4826) in schizophrenia patients was conducted to determine the magnitude and pattern of relationships between facial recognition, emotion processing, neurocognition, and type of symptom. Results Meta-analytic results indicated that facial recognition and emotion processing are strongly interrelated (r = .51). In addition, the relationship between FR and EP through voice prosody (r = .58) is as strong as the relationship between FR and EP based on facial stimuli (r = .53). Further, the relationship between emotion recognition, neurocognition, and symptoms is independent of the emotion processing modality – facial stimuli and voice prosody. Discussion The association between FR and EP that occurs through voice prosody suggests that FR is a fundamental cognitive process. The observed links between FR and EP might be due to bottom-up associations between neurocognition and EP, and not simply because most emotion recognition tasks use visual facial stimuli. In addition, links with symptoms, especially negative symptoms and disorganization, suggest possible symptom mechanisms that contribute to FR and EP deficits. PMID:24268469
Improvement of emotional healthcare system with stress detection from ECG signal.
Tivatansakul, S; Ohkura, M
2015-01-01
Our emotional healthcare system is designed to cope with users' negative emotions in daily life. To make the system more intelligent, we integrated emotion recognition from facial expressions to provide appropriate services based on the user's current emotional state. However, recognition from facial expressions alone confuses some positive, neutral, and negative emotions, which can cause the emotional healthcare system to provide a relaxation service even when users have no negative emotions. Therefore, to increase the effectiveness of the system in providing the relaxation service, we integrated stress detection from the ECG signal, which can address the confusion inherent in emotion recognition from facial expressions. Indeed, our results show that integrating stress detection increases the effectiveness and efficiency of the emotional healthcare system in providing services.
Normal-Hearing Listeners’ and Cochlear Implant Users’ Perception of Pitch Cues in Emotional Speech
Fuller, Christina; Gilbers, Dicky; Broersma, Mirjam; Goudbeek, Martijn; Free, Rolien; Başkent, Deniz
2015-01-01
In cochlear implants (CIs), acoustic speech cues, especially for pitch, are delivered in a degraded form. This study’s aim is to assess whether due to degraded pitch cues, normal-hearing listeners and CI users employ different perceptual strategies to recognize vocal emotions, and, if so, how these differ. Voice actors were recorded pronouncing a nonce word in four different emotions: anger, sadness, joy, and relief. These recordings’ pitch cues were phonetically analyzed. The recordings were used to test 20 normal-hearing listeners’ and 20 CI users’ emotion recognition. In congruence with previous studies, high-arousal emotions had a higher mean pitch, wider pitch range, and more dominant pitches than low-arousal emotions. Regarding pitch, speakers did not differentiate emotions based on valence but on arousal. Normal-hearing listeners outperformed CI users in emotion recognition, even when presented with CI simulated stimuli. However, only normal-hearing listeners recognized one particular actor’s emotions worse than the other actors’. The groups behaved differently when presented with similar input, showing that they had to employ differing strategies. Considering the respective speaker’s deviating pronunciation, it appears that for normal-hearing listeners, mean pitch is a more salient cue than pitch range, whereas CI users are biased toward pitch range cues. PMID:27648210
ERP evidence for the recognition of emotional prosody through simulated cochlear implant strategies.
Agrawal, Deepashri; Timm, Lydia; Viola, Filipa Campos; Debener, Stefan; Büchner, Andreas; Dengler, Reinhard; Wittfoth, Matthias
2012-09-20
Emotionally salient information in spoken language can be provided by variations in speech melody (prosody) or by emotional semantics. Emotional prosody is essential to convey feelings through speech. In sensorineural hearing loss, impaired speech perception can be improved by cochlear implants (CIs). The aim of this study was to investigate the performance of normal-hearing (NH) participants on the perception of emotional prosody with vocoded stimuli. Semantically neutral sentences with emotional (happy, angry, and neutral) prosody were used. Sentences were manipulated to simulate two CI speech-coding strategies: the Advanced Combination Encoder (ACE) and the newly developed Psychoacoustic Advanced Combination Encoder (PACE). Twenty NH adults were asked to recognize emotional prosody from ACE and PACE simulations. Performance was assessed using behavioral tests and event-related potentials (ERPs). Behavioral data revealed superior performance with original stimuli compared to the simulations. For the simulations, better recognition was observed for happy and angry prosody than for neutral. Irrespective of simulated or unsimulated stimulus type, a significantly larger P200 event-related potential was observed after sentence onset for happy prosody than for the other two emotions. Further, the P200 amplitude was significantly more positive for the PACE strategy than for the ACE strategy. These results suggest the P200 peak is an indicator of active differentiation and recognition of emotional prosody. The larger P200 peak amplitude for happy prosody indicates the importance of fundamental frequency (F0) cues in prosody processing. The advantage of PACE over ACE highlights a privileged role of the psychoacoustic masking model in improving prosody perception. Taken together, the study emphasizes the importance of vocoded simulation for better understanding the prosodic cues that CI users may be utilizing.
Mendlewicz, L; Nef, F; Simon, Y
2001-01-01
Several studies have been carried out using the Stroop test in eating disorders. Some of these studies have brought to light the existence of cognitive and attention deficits linked principally to weight and to food in anorexic and bulimic patients. The aim of the current study was to replicate and clarify the existence of cognitive and attention deficits in anorexic patients using the Stroop test and a word recognition test. The recognition test is made up of 160 words: 80 words from the previous Stroop experiment, mixed at random and semantically matched to 80 distractors. The word recognition test was carried out 2 or 3 days after the Stroop test. Thirty-two subjects took part in the study: 16 female patients hospitalised for anorexia nervosa and 16 normal female controls. Our results do not enable us to confirm the existence of specific cognitive deficits in anorexic patients. Copyright 2001 S. Karger AG, Basel
Bechtoldt, Myriam N; Schneider, Vanessa K
2016-09-01
While emotional intelligence (EI) is recognized as a resource in social interactions, we hypothesized a positive association with stress in socially evaluative contexts. In particular, we expected emotion recognition, the core component of EI, to inflict stress on individuals in negatively valenced interactions. We expected this association to be stronger for status-driven individuals, that is, for individuals scoring high on basal testosterone. In a laboratory experiment, N = 166 male participants underwent the Trier Social Stress Test (Kirschbaum, Pirke, & Hellhammer, 1993). As expected, EI measured by the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT V2.0; Mayer et al., 2003) predicted higher cortisol reactivity, including slower recovery from stress. The effect was moderated by basal testosterone, such that the association was positive when basal testosterone was high but not when it was low. On the component level of EI, the interaction was replicated for negative emotion recognition. These findings lend support to the hypothesis that EI is associated with higher activity of the hypothalamic-pituitary-adrenal axis in contexts where social status is at stake, particularly for those individuals who are more status-driven. Thus, the effects of EI are not unequivocally positive: While EI may positively affect the course of social interactions, it also inflicts stress on the emotionally intelligent individuals themselves. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Does Facial Expression Recognition Provide a Toehold for the Development of Emotion Understanding?
ERIC Educational Resources Information Center
Strand, Paul S.; Downs, Andrew; Barbosa-Leiker, Celestina
2016-01-01
The authors explored predictions from basic emotion theory (BET) that facial emotion expression recognition skills are insular with respect to their own development, and yet foundational to the development of emotional perspective-taking skills. Participants included 417 preschool children for whom estimates of these 2 emotion understanding…
Jürgens, Rebecca; Grass, Annika; Drolet, Matthis; Fischer, Julia
Both in the performative arts and in emotion research, professional actors are assumed to be capable of delivering emotions comparable to spontaneous emotional expressions. This study examines the effects of acting training on vocal emotion depiction and recognition. We predicted that professional actors express emotions in a more realistic fashion than non-professional actors. However, professional acting training may instill a particular speech pattern; this might make actors' vocal expressions less comparable to authentic samples than those of non-professional actors. We compared 80 emotional speech tokens from radio interviews with 80 re-enactments by professional and inexperienced actors, respectively. We analyzed recognition accuracies for emotion and authenticity ratings and compared the acoustic structure of the speech tokens. Both play-acted conditions yielded similar recognition accuracies and possessed more variable pitch contours than the spontaneous recordings. However, professional actors exhibited signs of different articulation patterns compared to non-trained speakers. Our results indicate that, for emotion research, emotional expressions by professional actors are not better suited than those from non-actors.
Demirel, Husrev; Yesilbas, Dilek; Ozver, Ismail; Yuksek, Erhan; Sahin, Feyzi; Aliustaoglu, Suheyla; Emul, Murat
2014-04-01
It is well known that patients with bipolar disorder are more prone to violence and exhibit more criminal behavior than the general population. A strong relationship between criminal behavior and the inability to empathize and to perceive other people's feelings and facial expressions increases the risk of delinquent behavior. In this study, we aimed to investigate deficits of facial emotion recognition ability in euthymic bipolar patients who had committed an offense and to compare them with non-delinquent euthymic patients with bipolar disorder. Fifty-five euthymic patients with delinquent behaviors and 54 non-delinquent euthymic bipolar patients as a control group were included in the study. Ekman's Facial Emotion Recognition Test, sociodemographic data, the Hare Psychopathy Checklist, the Hamilton Depression Rating Scale, and the Young Mania Rating Scale were applied to both groups. There were no significant differences between the case and control groups in terms of mean age, gender, level of education, mean age at disease onset, and suicide attempts (p>0.05). The three most common delinquent behaviors in patients with euthymic bipolar disorder were injury (30.8%), threat or insult (20%), and homicide (12.7%). The most accurately identified facial emotion was "happy" (>99% in both groups), while the most frequently misidentified facial emotion was "fear" (<50% in both groups). The overall accuracy of facial emotion recognition was significantly impaired in patients with delinquent behaviors compared to non-delinquent ones (p<0.05). The accuracy of recognizing fear expressions was significantly worse in the case group than in the control group (p<0.05), and it tended to be worse for angry facial expressions in criminal euthymic bipolar patients. Response times to happy, fearful, disgusted, and angry expressions were significantly longer in the case group than in the control group (p<0.05).
This study is the first to examine facial emotion recognition ability in euthymic patients with bipolar disorder who had delinquent behaviors. We have shown that such patients may have social interaction problems, i.e., misrecognizing fearful and, to a lesser extent, angry facial emotions, and may need more time to respond to facial emotions even in remission. Copyright © 2014 Elsevier Inc. All rights reserved.
Kempnich, Clare L; Wong, Dana; Georgiou-Karistianis, Nellie; Stout, Julie C
2017-04-01
Deficits in the recognition of negative emotions emerge before clinical diagnosis in Huntington's disease (HD). To address emotion recognition deficits, which have been shown in schizophrenia to be improved by computerized training, we conducted a study of the feasibility and efficacy of computerized training of emotion recognition in HD. We randomly assigned 22 individuals with premanifest or early symptomatic HD to the training or control group. The training group used a self-guided online training program, MicroExpression Training Tool (METT), twice weekly for 4 weeks. All participants completed measures of emotion recognition at baseline and post-training time-points. Participants in the training group also completed training adherence measures. Participants in the training group completed seven of the eight sessions on average. Results showed a significant group by time interaction, indicating that METT training was associated with improved accuracy in emotion recognition. Although sample size was small, our study demonstrates that emotion recognition remediation using the METT is feasible in terms of training adherence. The evidence also suggests METT may be effective in premanifest or early-symptomatic HD, opening up a potential new avenue for intervention. Further study with a larger sample size is needed to replicate these findings, and to characterize the durability and generalizability of these improvements, and their impact on functional outcomes in HD. (JINS, 2017, 23, 314-321).
Encoding conditions affect recognition of vocally expressed emotions across cultures.
Jürgens, Rebecca; Drolet, Matthis; Pirow, Ralph; Scheiner, Elisabeth; Fischer, Julia
2013-01-01
Although the expression of emotions in humans is considered to be largely universal, cultural effects contribute to both emotion expression and recognition. To disentangle the interplay between these factors, play-acted and authentic (non-instructed) vocal expressions of emotions were used, on the assumption that cultural effects may contribute differentially to the recognition of staged and spontaneous emotions. Speech tokens depicting four emotions (anger, sadness, joy, fear) were obtained from German radio archives and re-enacted by professional actors, and presented to 120 participants from Germany, Romania, and Indonesia. Participants in all three countries were poor at distinguishing between play-acted and spontaneous emotional utterances (58.73% correct on average with only marginal cultural differences). Nevertheless, authenticity influenced emotion recognition: across cultures, anger was recognized more accurately when play-acted (z = 15.06, p < 0.001) and sadness when authentic (z = 6.63, p < 0.001), replicating previous findings from German populations. German subjects revealed a slight advantage in recognizing emotions, indicating a moderate in-group advantage. There was no difference between Romanian and Indonesian subjects in the overall emotion recognition. Differential cultural effects became particularly apparent in terms of differential biases in emotion attribution. While all participants labeled play-acted expressions as anger more frequently than expected, German participants exhibited a further bias toward choosing anger for spontaneous stimuli. In contrast to the German sample, Romanian and Indonesian participants were biased toward choosing sadness. These results support the view that emotion recognition rests on a complex interaction of human universals and cultural specificities. Whether and in which way the observed biases are linked to cultural differences in self-construal remains an issue for further investigation.
Mineralocorticoid receptor haplotype, estradiol, progesterone and emotional information processing.
Hamstra, Danielle A; de Kloet, E Ronald; Quataert, Ina; Jansen, Myrthe; Van der Does, Willem
2017-02-01
Carriers of MR-haplotype 1 and 3 (GA/CG; rs5522 and rs2070951) are more sensitive to the influence of oral contraceptives (OC) and menstrual cycle phase on emotional information processing than MR-haplotype 2 (CA) carriers. We investigated whether this effect is associated with estradiol (E2) and/or progesterone (P4) levels. Healthy MR-genotyped premenopausal women were tested twice in a counterbalanced design. Naturally cycling (NC) women were tested in the early-follicular and mid-luteal phase and OC-users during OC-intake and in the pill-free week. At both sessions E2 and P4 were assessed in saliva. Tests included implicit and explicit positive and negative affect, attentional blink accuracy, emotional memory, emotion recognition, and risky decision-making (gambling). MR-haplotype 2 homozygotes had higher implicit happiness scores than MR-haplotype 2 heterozygotes (p=0.031) and MR-haplotype 1/3 carriers (p<0.001). MR-haplotype 2 homozygotes also had longer reaction times to happy faces in an emotion recognition test than MR-haplotype 1/3 (p=0.001). Practice effects were observed for most measures. The pattern of correlations between information processing and P4 or E2 differed between sessions, as well as the moderating effects of the MR genotype. In the first session the MR-genotype moderated the influence of P4 on implicit anxiety (sr=-0.30; p=0.005): higher P4 was associated with reduction in implicit anxiety, but only in MR-haplotype 2 homozygotes (sr=-0.61; p=0.012). In the second session the MR-genotype moderated the influence of E2 on the recognition of facial expressions of happiness (sr=-0.21; p=0.035): only in MR-haplotype 1/3 higher E2 was correlated with happiness recognition (sr=0.29; p=0.005). In the second session higher E2 and P4 were negatively correlated with accuracy in lag2 trials of the attentional blink task (p<0.001). Thus NC women, compared to OC-users, performed worse on lag 2 trials (p=0.041). 
The higher implicit happiness scores of MR-haplotype 2 homozygotes are in line with previous reports. Performance in the attentional blink task may be influenced by OC-use. The MR-genotype moderates the influence of E2 and P4 on emotional information processing. This moderating effect may depend on the novelty of the situation. Copyright © 2016 Elsevier Ltd. All rights reserved.
Kujawa, Autumn; Dougherty, Lea; Durbin, C Emily; Laptook, Rebecca; Torpey, Dana; Klein, Daniel N
2014-02-01
Emotion knowledge in childhood has been shown to predict social functioning and psychological well-being, but relatively little is known about parental factors that influence its development in early childhood. There is some evidence that both parenting behavior and maternal depression are associated with emotion recognition, but previous research has only examined these factors independently. The current study assessed auditory and visual emotion recognition ability among a large sample of preschool children to examine typical emotion recognition skills in children of this age, as well as the independent and interactive effects of maternal and paternal depression and negative parenting (i.e., hostility and intrusiveness). Results indicated that children were most accurate at identifying happy emotional expressions. The lowest accuracy was observed for neutral expressions. A significant interaction was found between maternal depression and negative parenting behavior: children with a maternal history of depression were particularly sensitive to the negative effects of maladaptive parenting behavior on emotion recognition ability. No significant effects were found for paternal depression. These results highlight the importance of examining the effects of multiple interacting factors on children's emotional development and provide suggestions for identifying children for targeted preventive interventions.
Dalkıran, Mihriban; Tasdemir, Akif; Salihoglu, Tamer; Emul, Murat; Duran, Alaattin; Ugur, Mufit; Yavuz, Ruhi
2017-09-01
People with schizophrenia have impairments in emotion recognition along with other social cognitive deficits. In the current study, we aimed to investigate the immediate benefits of ECT on facial emotion recognition ability. Thirty-two treatment-resistant patients with schizophrenia for whom ECT was indicated enrolled in the study. Facial emotion stimuli were a set of 56 photographs depicting seven basic emotions: sadness, anger, happiness, disgust, surprise, fear, and neutral faces. The average age of the participants was 33.4 ± 10.5 years. The rate of recognizing the disgusted facial expression increased significantly after ECT (p < 0.05), and no significant changes were found for the other facial expressions (p > 0.05). After ECT, response times to the fearful and happy facial expressions were significantly shorter (p < 0.05). Facial emotion recognition is an important social cognitive skill for social harmony, proper relationships, and independent living. At the least, ECT sessions do not seem to affect facial emotion recognition ability negatively, and they appear to improve identification of the disgusted facial emotion, which is related to dopamine-enriched regions of the brain.
ERIC Educational Resources Information Center
Sawyer, Alyssa C. P.; Williamson, Paul; Young, Robyn
2014-01-01
Deficits in emotion recognition and social interaction characterize individuals with Asperger's Disorder (AS). Moreover they also appear to be less able to accurately use confidence to gauge their emotion recognition accuracy (i.e., metacognitive monitoring). The aim of this study was to extend this finding by considering both monitoring and…
Social emotion recognition, social functioning, and attempted suicide in late-life depression.
Szanto, Katalin; Dombrovski, Alexandre Y; Sahakian, Barbara J; Mulsant, Benoit H; Houck, Patricia R; Reynolds, Charles F; Clark, Luke
2012-03-01
Lack of feeling connected and poor social problem solving have been described in suicide attempters. However, the cognitive substrates of this apparent social impairment in suicide attempters remain unknown. One possible deficit, the inability to recognize others' complex emotional states, has been observed not only in disorders characterized by prominent social deficits (autism-spectrum disorders and frontotemporal dementia) but also in depression and normal aging. This study assessed the relationship between social emotion recognition, problem solving, social functioning, and attempted suicide in late-life depression. There were 90 participants: 24 older depressed suicide attempters, 38 nonsuicidal depressed elders, and 28 comparison subjects with no psychiatric history. We compared performance on the Reading the Mind in the Eyes test and measures of social networks, social support, social problem solving, and chronic interpersonal difficulties in these three groups. Suicide attempters committed significantly more errors in social emotion recognition and showed poorer global cognitive performance than elders with no psychiatric history. Attempters had restricted social networks: they were less likely to talk to their children, had fewer close friends, and did not engage in volunteer activities, compared to nonsuicidal depressed elders and those with no psychiatric history. They also reported a pattern of struggle against others and hostility in relationships, felt a lack of social support, perceived social problems as impossible to resolve, and displayed a careless/impulsive approach to problems. Suicide attempts in depressed elders were associated with poor social problem solving, constricted social networks, and disruptive interpersonal relationships. Impaired social emotion recognition in the suicide attempter group was related.
Moghadam, Saeed Montazeri; Seyyedsalehi, Seyyed Ali
2018-05-31
Nonlinear components extracted from the deep structures of bottleneck neural networks exhibit a great ability to express the input space in a low-dimensional manifold. Sharing and combining the components boosts the capability of neural networks to synthesize and interpolate new and imaginary data. This synthesis is possibly a simple model of imagination in the human brain, where the components are expressed in a nonlinear low-dimensional manifold. The current paper introduces a novel Dynamic Deep Bottleneck Neural Network to analyze and extract three main features of videos regarding the expression of emotions on the face. These main features are identity, emotion, and expression intensity, which lie in three different sub-manifolds of one nonlinear general manifold. The proposed model, enjoying the advantages of recurrent networks, was used to analyze the sequence and dynamics of information in videos. Notably, this model also has the potential to synthesize new videos showing variations of one specific emotion on the face of unknown subjects. Experiments on the discrimination and recognition ability of the extracted components showed that the proposed model achieves an average accuracy of 97.77% in recognizing six prominent emotions (fear, surprise, sadness, anger, disgust, and happiness) and 78.17% in recognizing intensity. The produced videos revealed variations from neutral to the apex of an emotion on the face of an unfamiliar test subject, with an average similarity of 0.8 to the reference videos on the SSIM scale. Copyright © 2018 Elsevier Ltd. All rights reserved.
Gender differences in emotion recognition: Impact of sensory modality and emotional category.
Lambrecht, Lena; Kreifelts, Benjamin; Wildgruber, Dirk
2014-04-01
Results from studies on gender differences in emotion recognition vary, depending on the types of emotion and the sensory modalities used for stimulus presentation. This makes comparability between different studies problematic. This study investigated emotion recognition in healthy participants (N = 84; 40 males; ages 20 to 70 years), using dynamic stimuli displayed by two genders in three different sensory modalities (auditory, visual, audio-visual) and five emotional categories. The participants were asked to categorise the stimuli on the basis of their nonverbal emotional content (happy, alluring, neutral, angry, and disgusted). Hit rates and category selection biases were analysed. Women were found to be more accurate in recognition of emotional prosody. This effect was partially mediated by hearing loss at 8,000 Hz. Moreover, there was a gender-specific selection bias for alluring stimuli: men, as compared to women, chose "alluring" more often when a stimulus was presented by a woman than by a man.
Gamond, L; Cattaneo, Z
2016-12-01
Consistent evidence suggests that emotional facial expressions are better recognized when the expresser and the perceiver belong to the same social group (in-group advantage). In this study, we used transcranial magnetic stimulation (TMS) to investigate the possible causal involvement of the dorsomedial prefrontal cortex (dmPFC) and of the right temporo-parietal junction (TPJ), two main nodes of the mentalizing neural network, in mediating the in-group advantage in emotion recognition. Participants performed an emotion discrimination task in a minimal (blue/green) group paradigm. We found that interfering with activity in the dmPFC significantly interfered with the effect of minimal group-membership on emotion recognition, reducing participants' ability to discriminate emotions expressed by in-group members. In turn, rTPJ mainly affected emotion discrimination per se, irrespective of group membership. Overall, our results point to a causal role of the dmPFC in mediating the in-group advantage in emotion recognition, favoring intragroup communication. Copyright © 2016 Elsevier Ltd. All rights reserved.
The emotional carryover effect in memory for words.
Schmidt, Stephen R; Schmidt, Constance R
2016-08-01
Emotional material rarely occurs in isolation; rather it is experienced in the spatial and temporal proximity of less emotional items. Some previous researchers have found that emotional stimuli impair memory for surrounding information, whereas others have reported evidence for memory facilitation. Researchers have not determined which types of emotional items or memory tests produce effects that carry over to surrounding items. Six experiments are reported that measured carryover from emotional words varying in arousal to temporally adjacent neutral words. Taboo, non-taboo emotional, and neutral words were compared using different stimulus onset asynchronies (SOAs), recognition and recall tests, and intentional and incidental memory instructions. Strong emotional memory effects were obtained in all six experiments. However, emotional items influenced memory for temporally adjacent words under limited conditions. Words following taboo words were more poorly remembered than words following neutral words when relatively short SOAs were employed. Words preceding taboo words were affected only when recall tests and relatively short retention intervals were used. These results suggest that increased attention to the emotional items sometimes produces emotional carryover effects; however, retrieval processes also contribute to retrograde amnesia and may extend the conditions under which anterograde amnesia is observed.
Martinez, Maria; Multani, Namita; Anor, Cassandra J.; Misquitta, Karen; Tang-Wai, David F.; Keren, Ron; Fox, Susan; Lang, Anthony E.; Marras, Connie; Tartaglia, Maria C.
2018-01-01
Background: Changes in social cognition occur in patients with Alzheimer’s disease (AD) and Parkinson’s disease (PD) and can be caused by several factors, including emotion recognition deficits and neuropsychiatric symptoms (NPS). The aims of this study were to investigate: (1) group differences in emotion detection between patients diagnosed with AD or PD and their respective caregivers; (2) the association of emotion detection with empathetic ability and NPS in individuals with AD or PD; (3) caregivers’ depression and perceived burden in relation to patients’ ability to detect emotions, empathize with others, and presence of NPS; and (4) caregivers’ awareness of emotion detection deficits in patients with AD or PD. Methods: In this study, patients with probable AD (N = 25) or PD (N = 17), and their caregivers (N = 42), performed an emotion detection task (The Awareness of Social Inference Test—Emotion Evaluation Test, TASIT-EET). Patients underwent cognitive assessment, using the Behavioral Neurology Assessment (BNA). In addition, caregivers completed questionnaires to measure empathy (Interpersonal Reactivity Index, IRI) and NPS (Neuropsychiatric Inventory, NPI) in patients and self-reported on depression (Geriatric Depression Scale, GDS) and burden (Zarit Burden Interview, ZBI). Caregivers were also interviewed to measure dementia severity (Clinical Dementia Rating (CDR) Scale) in patients. Results: The results suggest that individuals with AD and PD are significantly worse at recognizing emotions than their caregivers. Moreover, caregivers failed to recognize patients’ emotion recognition deficits, and this was associated with increased caregiver burden and depression. Patients’ emotion recognition deficits, decreased empathy and NPS were also related to caregiver burden and depression.
Conclusions: Changes in emotion detection and empathy in individuals with AD and PD have implications for caregiver burden and depression and may be amenable to interventions with both patients and caregivers. PMID:29740312
Liu, Yi-Hung; Wu, Chien-Te; Cheng, Wei-Teng; Hsiao, Yu-Tsung; Chen, Po-Ming; Teng, Jyh-Tong
2014-01-01
Electroencephalogram-based emotion recognition (EEG-ER) has received increasing attention in the fields of health care, affective computing, and brain-computer interface (BCI). However, satisfactory ER performance within a bi-dimensional and non-discrete emotional space using single-trial EEG data remains a challenging task. To address this issue, we propose a three-layer scheme for single-trial EEG-ER. In the first layer, a set of spectral powers of different EEG frequency bands are extracted from multi-channel single-trial EEG signals. In the second layer, the kernel Fisher's discriminant analysis method is applied to further extract features with better discrimination ability from the EEG spectral powers. The feature vector produced by layer 2 is called a kernel Fisher's emotion pattern (KFEP), and is sent into layer 3 for further classification where the proposed imbalanced quasiconformal kernel support vector machine (IQK-SVM) serves as the emotion classifier. The outputs of the three layer EEG-ER system include labels of emotional valence and arousal. Furthermore, to collect effective training and testing datasets for the current EEG-ER system, we also use an emotion-induction paradigm in which a set of pictures selected from the International Affective Picture System (IAPS) are employed as emotion induction stimuli. The performance of the proposed three-layer solution is compared with that of other EEG spectral power-based features and emotion classifiers. Results on 10 healthy participants indicate that the proposed KFEP feature performs better than other spectral power features, and IQK-SVM outperforms traditional SVM in terms of the EEG-ER accuracy. Our findings also show that the proposed EEG-ER scheme achieves the highest classification accuracies of valence (82.68%) and arousal (84.79%) among all testing methods. PMID:25061837
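Layer 1 of the scheme described above, band-limited spectral power from multi-channel single-trial EEG, can be sketched as follows. The band edges, sampling rate, and Welch settings here are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np
from scipy.signal import welch

# illustrative band definitions in Hz; the paper's exact bands may differ
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_powers(trial, fs=256):
    """trial: (n_channels, n_samples) single-trial EEG.

    Returns one mean band power per channel and band."""
    feats = []
    for ch in trial:
        # Welch periodogram of one channel
        f, pxx = welch(ch, fs=fs, nperseg=min(len(ch), fs))
        for lo, hi in BANDS.values():
            mask = (f >= lo) & (f < hi)
            feats.append(pxx[mask].mean())  # mean power within the band
    return np.array(feats)
```

The resulting vector (here 5 bands × n_channels) would then feed the kernel Fisher's discriminant step (layer 2) and the SVM classifier (layer 3) of the proposed pipeline.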
How Deep Neural Networks Can Improve Emotion Recognition on Video Data
2016-09-25
Khorrami, Pooya; Paine, Tom Le; Brady, Kevin; Dagli, Charlie; Thomas S…
In this work, we present a system that performs emotion recognition on video data using both convolutional neural networks (CNNs) and recurrent neural networks (RNNs). We present our findings on videos from the Audio/Visual+Emotion Challenge (AV+EC2015). In our experiments, we analyze the effects…
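The CNN-plus-RNN idea, spatial features extracted per frame and then a recurrence over time, can be caricatured in a few lines of NumPy. This is a toy illustration of the architecture's data flow only, not the paper's networks; the kernel, weights, and pooling choices are arbitrary.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def conv_frame_feature(frame, kernel):
    # "CNN" stand-in: one valid 2D convolution, ReLU, global average pool
    patches = sliding_window_view(frame, kernel.shape)
    fmap = np.einsum("ijkl,kl->ij", patches, kernel)
    return np.maximum(fmap, 0.0).mean()

def rnn_over_frames(features, w_h=0.9, w_x=1.0):
    # "RNN" stand-in: Elman-style recurrence aggregating per-frame features
    h = 0.0
    for x in features:
        h = np.tanh(w_h * h + w_x * x)
    return h  # final hidden state summarizes the clip
```

In a real system both stages would be learned jointly and the final hidden state would feed a classifier or a continuous valence/arousal regressor.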
Actis-Grosso, Rossana; Bossi, Francesco; Ricciardelli, Paola
2015-01-01
We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits or with High Functioning Autism Spectrum Disorder (HAT group) was compared in the recognition of four emotions (Happiness, Anger, Fear, and Sadness) either shown in static faces or conveyed by moving body patch-light displays (PLDs). Overall, HAT individuals were as accurate as LAT ones in perceiving emotions both with faces and with PLDs. Moreover, they correctly described non-emotional actions depicted by PLDs, indicating that they perceived the motion conveyed by the PLDs per se. For LAT participants, happiness proved to be the easiest emotion to be recognized: in line with previous studies we found a happy face advantage for faces, which for the first time was also found for bodies (happy body advantage). Furthermore, LAT participants recognized sadness better by static faces and fear by PLDs. This advantage for motion kinematics in the recognition of fear was not present in HAT participants, suggesting that (i) emotion recognition is not generally impaired in HAT individuals, (ii) the cues exploited for emotion recognition by LAT and HAT groups are not always the same. These findings are discussed against the background of emotional processing in typically and atypically developed individuals. PMID:26557101
A Multidimensional Approach to the Study of Emotion Recognition in Autism Spectrum Disorders
Xavier, Jean; Vignaud, Violaine; Ruggiero, Rosa; Bodeau, Nicolas; Cohen, David; Chaby, Laurence
2015-01-01
Although deficits in emotion recognition have been widely reported in autism spectrum disorder (ASD), experiments have been restricted to either facial or vocal expressions. Here, we explored multimodal emotion processing in children with ASD (N = 19) and with typical development (TD, N = 19), considering unimodal (faces or voices) and multimodal (faces and voices simultaneously) stimuli and developmental comorbidities (neuro-visual, language and motor impairments). Compared to TD controls, children with ASD had rather high and heterogeneous emotion recognition scores but also showed several significant differences: lower emotion recognition scores for visual stimuli and for neutral emotion, and a greater number of saccades during the visual task. Multivariate analyses showed that: (1) the difficulties they experienced with visual stimuli were partially alleviated with multimodal stimuli. (2) Developmental age was significantly associated with emotion recognition in TD children, whereas it was the case only for the multimodal task in children with ASD. (3) Language impairments tended to be associated with emotion recognition scores of ASD children in the auditory modality. Conversely, in the visual or bimodal (visuo-auditory) tasks, the impact of developmental coordination disorder or neuro-visual impairments was not found. We conclude that impaired emotion processing constitutes a dimension to explore in the field of ASD, as research has the potential to define more homogeneous subgroups and tailored interventions. However, it is clear that developmental age, the nature of the stimuli, and other developmental comorbidities must also be taken into account when studying this dimension. PMID:26733928
Effects of facial emotion recognition remediation on visual scanning of novel face stimuli.
Marsh, Pamela J; Luckett, Gemma; Russell, Tamara; Coltheart, Max; Green, Melissa J
2012-11-01
Previous research shows that emotion recognition in schizophrenia can be improved with targeted remediation that draws attention to important facial features (eyes, nose, mouth). Moreover, the effects of training have been shown to last for up to one month after training. The aim of this study was to investigate whether improved emotion recognition of novel faces is associated with concomitant changes in visual scanning of these same novel facial expressions. Thirty-nine participants with schizophrenia received emotion recognition training using Ekman's Micro-Expression Training Tool (METT), with emotion recognition and visual scanpath (VSP) recordings to face stimuli collected simultaneously. Baseline ratings of interpersonal and cognitive functioning were also collected from all participants. Post-METT training, participants showed changes in foveal attention to the features of facial expressions of emotion not used in METT training, which were generally consistent with the information about important features from the METT. In particular, there were changes in how participants looked at the features of facial expressions of surprise, disgust, fear, happiness, and neutrality, demonstrating that improved emotion recognition is paralleled by changes in the way participants with schizophrenia viewed novel facial expressions of emotion. However, there were overall decreases in foveal attention to sad and neutral faces, which indicates that more intensive instruction might be needed for these faces during training. Most importantly, the evidence shows that participant gender may affect training outcomes.
Emotion recognition based on multiple order features using fractional Fourier transform
NASA Astrophysics Data System (ADS)
Ren, Bo; Liu, Deyin; Qi, Lin
2017-07-01
In order to address the limitations of recent algorithms based on the two-dimensional fractional Fourier transform (2D-FrFT), this paper proposes a multiple-order-features-based method for emotion recognition. Most existing methods utilize features of a single order or a couple of orders of the 2D-FrFT. However, different orders of the 2D-FrFT contribute differently to feature extraction for emotion recognition, and combining these features can enhance the performance of an emotion recognition system. The proposed approach obtains numerous features extracted at different orders of the 2D-FrFT in the directions of the x-axis and y-axis, and uses their statistical magnitudes as the final feature vectors for recognition. A Support Vector Machine (SVM) is utilized for classification, and the RML Emotion database and the Cohn-Kanade (CK) database are used for the experiments. The experimental results demonstrate the effectiveness of the proposed method.
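One common discretization of the FrFT takes a fractional power of the unitary DFT matrix, applied separably along each image axis; statistical magnitudes over several orders then form the feature vector. The sketch below follows that definition with illustrative orders and statistics; the paper's exact discretization and feature set may differ.

```python
import numpy as np
from scipy.linalg import fractional_matrix_power

def frft_matrix(n, a):
    # a-th power of the unitary DFT matrix (a = 1 gives the ordinary DFT)
    F = np.fft.fft(np.eye(n), norm="ortho")
    return fractional_matrix_power(F, a)

def frft2(img, ax, ay):
    # separable 2D transform: rows at order ax, columns at order ay
    Fx = frft_matrix(img.shape[1], ax)
    Fy = frft_matrix(img.shape[0], ay)
    return Fy @ img @ Fx.T

def multi_order_features(img, orders=(0.25, 0.5, 0.75, 1.0)):
    # statistical magnitudes across several transform orders
    feats = []
    for a in orders:
        mag = np.abs(frft2(img, a, a))
        feats += [mag.mean(), mag.std(), mag.max()]
    return np.array(feats)
```

The concatenated vector (here 4 orders × 3 statistics) would then be fed to an SVM, as in the paper.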
Assessing collective affect recognition via the Emotional Aperture Measure.
Sanchez-Burks, Jeffrey; Bartel, Caroline A; Rees, Laura; Huy, Quy
2016-01-01
Curiosity about collective affect is undergoing a revival in many fields. This literature, tracing back to Le Bon's seminal work on crowd psychology, has established the veracity of collective affect and demonstrated its influence on a wide range of group dynamics. More recently, an interest in the perception of collective affect has emerged, revealing a need for a methodological approach for assessing collective emotion recognition to complement measures of individual emotion recognition. This article addresses this need by introducing the Emotional Aperture Measure (EAM). Three studies provide evidence that collective affect recognition requires a processing style distinct from individual emotion recognition and establishes the validity and reliability of the EAM. A sample of working managers further shows how the EAM provides unique insights into how individuals interact with collectives. We discuss how the EAM can advance several lines of research on collective affect.
Eye-Gaze Analysis of Facial Emotion Recognition and Expression in Adolescents with ASD.
Wieckowski, Andrea Trubanova; White, Susan W
2017-01-01
Impaired emotion recognition and expression in individuals with autism spectrum disorder (ASD) may contribute to observed social impairment. The aim of this study was to examine the role of visual attention directed toward nonsocial aspects of a scene as a possible mechanism underlying recognition and expressive ability deficiency in ASD. One recognition and two expression tasks were administered. Recognition was assessed in a forced-choice paradigm, and expression was assessed during scripted and free-choice response (in response to emotional stimuli) tasks in youth with ASD (n = 20) and an age-matched sample of typically developing youth (n = 20). During stimulus presentation prior to response in each task, participants' eye gaze was tracked. Youth with ASD were less accurate at identifying disgust and sadness in the recognition task. They fixated less on the eye region of stimuli showing surprise. A group difference was found during the free-choice response task, such that those with ASD expressed emotion less clearly, but not during the scripted task. Results suggest altered eye gaze to the mouth region, but not the eye region, as a candidate mechanism for decreased ability to recognize or express emotion. Findings inform our understanding of the association between social attention and emotion recognition and expression deficits.
Buchy, Lisa; Barbato, Mariapaola; Makowski, Carolina; Bray, Signe; MacMaster, Frank P; Deighton, Stephanie; Addington, Jean
2017-11-01
People with psychosis show deficits recognizing facial emotions and disrupted activation in the underlying neural circuitry. We evaluated associations between facial emotion recognition and cortical thickness using a correlation-based approach to map structural covariance networks across the brain. Fifteen people with an early psychosis provided magnetic resonance scans and completed the Penn Emotion Recognition and Differentiation tasks. Fifteen historical controls provided magnetic resonance scans. Cortical thickness was computed using CIVET and analyzed with linear models. Seed-based structural covariance analysis was done using the mapping anatomical correlations across the cerebral cortex methodology. To map structural covariance networks involved in facial emotion recognition, the right somatosensory cortex and bilateral fusiform face areas were selected as seeds. Statistics were run in SurfStat. Findings showed greater cortical covariance between the right fusiform face area seed and the right orbitofrontal cortex in controls than in early psychosis subjects. Facial emotion recognition scores were not significantly associated with thickness in any region. A negative effect of Penn Differentiation scores on cortical covariance was seen between the left fusiform face area seed and the right superior parietal lobule in early psychosis subjects. Results suggest that facial emotion recognition ability is related to covariance in a temporal-parietal network in early psychosis.
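Seed-based structural covariance of the kind used above reduces to correlating a seed region's thickness with every other region's thickness across subjects. A minimal sketch, with hypothetical data dimensions:

```python
import numpy as np

def seed_covariance_map(thickness, seed_idx):
    """thickness: (n_subjects, n_regions) cortical-thickness matrix.

    Returns the Pearson correlation of the seed region with every region."""
    t = thickness - thickness.mean(axis=0)   # center each region across subjects
    s = t[:, seed_idx]
    num = t.T @ s
    den = np.sqrt((t ** 2).sum(axis=0) * (s ** 2).sum())
    return num / den
```

Group differences in such maps (e.g. controls vs. early psychosis) are then tested region-wise, which is the essence of the mapping-anatomical-correlations approach named in the abstract.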
An audiovisual emotion recognition system
NASA Astrophysics Data System (ADS)
Han, Yi; Wang, Guoyin; Yang, Yong; He, Kun
2007-12-01
Human emotions can be expressed through many bio-symbols; speech and facial expression are two of them. Both are regarded as emotional information that plays an important role in human-computer interaction. Based on our previous studies on emotion recognition, an audiovisual emotion recognition system is developed and presented in this paper. The system is designed for real-time practice and is supported by several integrated modules. These modules include speech enhancement for eliminating noise, rapid face detection for locating the face in the background image, example-based shape learning for facial feature alignment, and an optical-flow-based tracking algorithm for facial feature tracking. It is known that irrelevant features and high dimensionality of the data can hurt the performance of a classifier, and rough set-based feature selection is a good method for dimension reduction. Accordingly, 13 of 37 speech features and 10 of 33 facial features are selected to represent emotional information, and 52 audiovisual features are selected, owing to the synchronization achieved when speech and video are fused together. The experimental results demonstrate that this system performs well in real-time practice and has a high recognition rate. Our results also suggest that multimodule fused recognition will become the trend of emotion recognition in the future.
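Rough set-based feature selection of the kind mentioned above keeps a feature subset whose values determine the class as well as the full feature set does. A minimal greedy-reduct sketch over discrete features (the toy rows and feature names below are invented; the system's actual attributes are the 37 speech and 33 facial features):

```python
from collections import defaultdict

def dependency(rows, feats, decision):
    # fraction of rows whose feature values determine the decision uniquely
    # (the rough-set "positive region" of the decision)
    groups = defaultdict(set)
    for row in rows:
        groups[tuple(row[f] for f in feats)].add(row[decision])
    ok = sum(1 for row in rows
             if len(groups[tuple(row[f] for f in feats)]) == 1)
    return ok / len(rows)

def greedy_reduct(rows, all_feats, decision):
    # greedily add the feature that most increases dependency,
    # stopping once the full feature set's dependency is matched
    chosen = []
    target = dependency(rows, all_feats, decision)
    while dependency(rows, chosen, decision) < target:
        best = max((f for f in all_feats if f not in chosen),
                   key=lambda f: dependency(rows, chosen + [f], decision))
        chosen.append(best)
    return chosen
```

Greedy selection is only an approximation of a minimal reduct, but it conveys why a 37-feature speech set can shrink to 13 features without losing discriminative power.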
John, Sufna Gheyara; DiLalla, Lisabeth F
2013-09-01
Studies have shown that children and parents provide different reports of children's victimization, with children often reporting more victimization. However, the reason for this differential reporting is unclear. This study explored two types of social biases (emotion recognition and perceived impairment) in parents and children as possible reasons underlying differential reporting. Six- to 10-year-old children and one of their parents were tested in a lab. Testing included subjective measures of parent alexithymic traits, child perceived impairment from victimization, and child- and parent-reported frequency of children's peer victimization and internalizing and externalizing difficulties. Parents and children also completed an objective measure of emotion recognition. Both types of social bias significantly predicted reports of children's peer victimization frequency as well as internalizing and externalizing difficulties, as rated by parents and children. Moreover, child perceived impairment bias, rather than parent emotion bias, best predicted differential reporting of peer victimization. Finally, a significant interaction demonstrated that the influence of child perceived impairment bias on differential reporting was most salient in the presence of parent emotion bias. This underscores the importance of expanding interventions for victimized youth to include the restructuring of social biases.
The Primacy of Perceiving: Emotion Recognition Buffers Negative Effects of Emotional Labor
ERIC Educational Resources Information Center
Bechtoldt, Myriam N.; Rohrmann, Sonja; De Pater, Irene E.; Beersma, Bianca
2011-01-01
There is ample empirical evidence for negative effects of emotional labor (surface acting and deep acting) on workers' well-being. This study analyzed to what extent workers' ability to recognize others' emotions may buffer these effects. In a 4-week study with 85 nurses and police officers, emotion recognition moderated the relationship between…
ERIC Educational Resources Information Center
Tell, Dina; Davidson, Denise
2015-01-01
In this research, the emotion recognition abilities of children with autism spectrum disorder and typically developing children were compared. When facial expressions and situational cues of emotion were congruent, accuracy in recognizing emotions was good for both children with autism spectrum disorder and typically developing children. When…
Fajrianthi; Zein, Rizqy Amelia
2017-01-01
This study aimed to develop an emotional intelligence (EI) test that is suitable to the Indonesian workplace context. The Airlangga Emotional Intelligence Test (Tes Kecerdasan Emosi Airlangga [TKEA]) was designed to measure three EI domains: 1) emotional appraisal, 2) emotional recognition, and 3) emotional regulation. TKEA consisted of 120 items, with 40 items for each subset. TKEA was developed based on the Situational Judgment Test (SJT) approach. To ensure its psychometric qualities, categorical confirmatory factor analysis (CCFA) and item response theory (IRT) were applied to test its validity and reliability. The study was conducted on 752 participants, and the results showed that the test information function (TIF) was 3.414 (ability level = 0) for subset 1, 12.183 (ability level = −2) for subset 2, and 2.398 (ability level = −2) for subset 3. It is concluded that TKEA performs very well in measuring individuals with a low level of EI ability. It is worth noting that TKEA is currently at the development stage; therefore, in this study, we investigated TKEA’s item analysis and the dimensionality of each TKEA subset. PMID:29238234
Hooker, Christine I; Bruce, Lori; Fisher, Melissa; Verosky, Sara C; Miyakawa, Asako; D'Esposito, Mark; Vinogradov, Sophia
2013-08-30
Both cognitive and social-cognitive deficits impact functional outcome in schizophrenia. Cognitive remediation studies indicate that targeted cognitive and/or social-cognitive training improves behavioral performance on trained skills. However, the neural effects of training in schizophrenia and their relation to behavioral gains are largely unknown. This study tested whether a 50-h intervention which included both cognitive and social-cognitive training would influence neural mechanisms that support social cognition. Schizophrenia participants completed a computer-based intervention of either auditory-based cognitive training (AT) plus social-cognition training (SCT) (N=11) or non-specific computer games (CG) (N=11). Assessments included a functional magnetic resonance imaging (fMRI) task of facial emotion recognition, and behavioral measures of cognition, social cognition, and functional outcome. The fMRI results showed the predicted group-by-time interaction. Results were strongest for emotion recognition of happy, surprise and fear: relative to CG participants, AT+SCT participants showed a neural activity increase in bilateral amygdala, right putamen and right medial prefrontal cortex. Across all participants, pre-to-post intervention neural activity increase in these regions predicted behavioral improvement on an independent emotion perception measure (MSCEIT: Perceiving Emotions). Among AT+SCT participants alone, neural activity increase in right amygdala predicted behavioral improvement in emotion perception. The findings indicate that combined cognition and social-cognition training improves neural systems that support social-cognition skills.
Aviezer, Hillel; Hassin, Ran R; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo
2012-04-01
The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normally sighted individuals extract information about expressed emotions from face regions with activity diagnostic for specific emotion categories. Given LG's impairment, we sought to shed light on his emotion perception by examining whether priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full face and then categorized the face's emotion. Critically, the matched components were from regions which were diagnostic or non-diagnostic of the emotion portrayed by the full face. In experiment 1, when the full faces were briefly presented (150 ms), LG's performance was strongly influenced by the diagnosticity of the components: his emotion recognition was boosted within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in experiment 2, when the face-exposure duration was extended (2000 ms), the beneficial effect of the diagnostic matching was diminished, as was the detrimental effect of the non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components.
Oldehinkel, Albertine J; Hartman, Catharina A; Van Oort, Floor V A; Nederhof, Esther
2015-01-01
Background: Some adolescents function poorly in apparently benign environments, while others thrive despite hassles and difficulties. The aim of this study was to examine if adolescents with specialized skills in the recognition of either positive or negative emotions have a context-dependent risk of developing an anxiety or depressive disorder during adolescence, depending on exposure to positive or harsh parenting. Methods: Data came from a large prospective Dutch population study (N = 1539). At age 11, perceived parental rejection and emotional warmth were measured by questionnaire, and emotion recognition skills by means of a reaction-time task. Lifetime diagnoses of anxiety and depressive disorders were assessed at about age 19, using a standardized diagnostic interview. Results: Adolescents who were specialized in the recognition of positive emotions had a relatively high probability of developing an anxiety disorder when exposed to parental rejection (B(specialization × rejection) = 0.23, P < 0.01) and a relatively low probability in response to parental emotional warmth (B(specialization × warmth) = −0.24, P = 0.01), while the opposite pattern was found for specialists in negative emotions. The effect of parental emotional warmth on depression onset was likewise modified by emotion recognition specialization (B = −0.13, P = 0.03), but the effect of parental rejection was not (B = 0.02, P = 0.72). In general, the relative advantage of specialists in negative emotions was restricted to fairly uncommon negative conditions. Conclusions: Our results suggest that there is no unequivocal relation between parenting behaviors and the probability of developing an anxiety or depressive disorder in adolescence, and that emotion recognition specialization may be a promising way to distinguish between various types of context-dependent reaction patterns. PMID:25642389
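Moderation effects of the kind reported above correspond to interaction coefficients in a regression of disorder onset on specialization, parenting, and their product. A generic logistic-regression-with-interaction sketch on simulated data (the predictor names and effect sizes are invented, not the study's estimates):

```python
import numpy as np

def fit_logistic(X, y, n_iter=25):
    # Newton-Raphson fit of a logistic regression; small ridge for stability
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        W = p * (1.0 - p)
        H = X.T @ (X * W[:, None]) + 1e-6 * np.eye(X.shape[1])
        w += np.linalg.solve(H, X.T @ (y - p))
    return w

rng = np.random.default_rng(4)
n = 4000
spec = rng.standard_normal(n)   # e.g. emotion-recognition specialization
rej = rng.standard_normal(n)    # e.g. perceived parental rejection
X = np.column_stack([np.ones(n), spec, rej, spec * rej])
true_w = np.array([-1.0, 0.2, 0.3, 0.25])
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ true_w))).astype(float)
w_hat = fit_logistic(X, y)      # w_hat[3] estimates the interaction (moderation) term
```

A positive interaction coefficient here plays the same role as the reported B(specialization × rejection): the effect of parenting on onset risk depends on the adolescent's recognition specialization.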
Liu, Xinyang; Hildebrandt, Andrea; Recio, Guillermo; Sommer, Werner; Cai, Xinxia; Wilhelm, Oliver
2017-01-01
Facial identity and facial expression processing are crucial socio-emotional abilities but seem to show only limited psychometric uniqueness when the processing speed is considered in easy tasks. We applied a comprehensive measurement of processing speed and contrasted performance specificity in socio-emotional, social and non-social stimuli from an individual differences perspective. Performance in a multivariate task battery could be best modeled by a general speed factor and a first-order factor capturing some specific variance due to processing emotional facial expressions. We further tested equivalence of the relationships between speed factors and polymorphisms of dopamine and serotonin transporter genes. Results show that the speed factors are not only psychometrically equivalent but invariant in their relation with the Catechol-O-Methyl-Transferase (COMT) Val158Met polymorphism. However, the 5-HTTLPR/rs25531 serotonin polymorphism was related with the first-order factor of emotion perception speed, suggesting a specific genetic correlate of processing emotions. We further investigated the relationship between several components of event-related brain potentials with psychometric abilities, and tested emotion specific individual differences at the neurophysiological level. Results revealed swifter emotion perception abilities to go along with larger amplitudes of the P100 and the Early Posterior Negativity (EPN), when emotion processing was modeled on its own. However, after partialling out the shared variance of emotion perception speed with general processing speed-related abilities, brain-behavior relationships did not remain specific for emotion. Together, the present results suggest that speed abilities are strongly interrelated but show some specificity for emotion processing speed at the psychometric level. At both genetic and neurophysiological levels, emotion specificity depended on whether general cognition is taken into account or not. 
These findings strongly suggest that general speed abilities should be taken into account when emotion recognition abilities are studied in their specificity. PMID:28848411
Children with autism spectrum disorder are skilled at reading emotion body language.
Peterson, Candida C; Slaughter, Virginia; Brownell, Celia
2015-11-01
Autism is commonly believed to impair the ability to perceive emotions, yet empirical evidence is mixed. Because face processing may be difficult for those with autism spectrum disorder (ASD), we developed a novel test of recognizing emotion via static body postures (Body-Emotion test) and evaluated it with children aged 5 to 12 years in two studies. In Study 1, 34 children with ASD and 41 typically developing (TD) controls matched for age and verbal intelligence (VIQ [verbal IQ]) were tested on (a) our new Body-Emotion test, (b) a widely used test of emotion recognition using photos of eyes as stimuli (Baron-Cohen et al.'s "Reading Mind in the Eyes: Child" or RMEC [Journal of Developmental and Learning Disorders, 2001, Vol. 5, pp. 47-78]), (c) a well-validated theory of mind (ToM) battery, and (d) a teacher-rated empathy scale. In Study 2 (33 children with ASD and 31 TD controls), the RMEC test was simplified to the six basic human emotions. Results of both studies showed that children with ASD performed as well as their TD peers on the Body-Emotion test. Yet TD children outperformed the ASD group on ToM and on both the standard RMEC test and the simplified version. VIQ was not related to perceiving emotions via either body posture or eyes for either group. However, recognizing emotions from body posture was correlated with ToM, especially for children with ASD. Finally, reading emotions from body posture was easier than reading emotions from eyes for both groups. Copyright © 2015 Elsevier Inc. All rights reserved.
Emotion Recognition in Fathers and Mothers at High-Risk for Child Physical Abuse
ERIC Educational Resources Information Center
Asla, Nagore; de Paul, Joaquin; Perez-Albeniz, Alicia
2011-01-01
Objective: The present study was designed to determine whether parents at high risk for physical child abuse, in comparison with parents at low risk, show deficits in emotion recognition, as well as to examine the moderator effect of gender and stress on the relationship between risk for physical child abuse and emotion recognition. Methods: Based…
ERIC Educational Resources Information Center
Parker, Alison E.; Mathis, Erin T.; Kupersmidt, Janis B.
2013-01-01
Research Findings: The study examined children's recognition of emotion from faces and body poses, as well as gender differences in these recognition abilities. Preschool-aged children ("N" = 55) and their parents and teachers participated in the study. Preschool-aged children completed a web-based measure of emotion recognition skills…
ERIC Educational Resources Information Center
Birmingham, Elina; Meixner, Tamara; Iarocci, Grace; Kanan, Christopher; Smilek, Daniel; Tanaka, James W.
2013-01-01
The strategies children employ to selectively attend to different parts of the face may reflect important developmental changes in facial emotion recognition. Using the Moving Window Technique (MWT), children aged 5-12 years and adults ("N" = 129) explored faces with a mouse-controlled window in an emotion recognition task. An…
On the Time Course of Vocal Emotion Recognition
Pell, Marc D.; Kotz, Sonja A.
2011-01-01
How quickly do listeners recognize emotions from a speaker's voice, and does the time course for recognition vary by emotion type? To address these questions, we adapted the auditory gating paradigm to estimate how much vocal information is needed for listeners to categorize five basic emotions (anger, disgust, fear, sadness, happiness) and neutral utterances produced by male and female speakers of English. Semantically-anomalous pseudo-utterances (e.g., The rivix jolled the silling) conveying each emotion were divided into seven gate intervals according to the number of syllables that listeners heard from sentence onset. Participants (n = 48) judged the emotional meaning of stimuli presented at each gate duration interval, in a successive, blocked presentation format. Analyses looked at how recognition of each emotion evolves as an utterance unfolds and estimated the “identification point” for each emotion. Results showed that anger, sadness, fear, and neutral expressions are recognized more accurately at short gate intervals than happiness, and particularly disgust; however, as speech unfolds, recognition of happiness improves significantly towards the end of the utterance (and fear is recognized more accurately than other emotions). When the gate associated with the emotion identification point of each stimulus was calculated, data indicated that fear (M = 517 ms), sadness (M = 576 ms), and neutral (M = 510 ms) expressions were identified from shorter acoustic events than the other emotions. These data reveal differences in the underlying time course for conscious recognition of basic emotions from vocal expressions, which should be accounted for in studies of emotional speech processing. PMID:22087275
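The identification-point calculation described above can be sketched as a small function. This is an illustrative definition only (the earliest gate from which the response is correct and stays correct through the end of the utterance), not the authors' exact scoring code.

```python
# Illustrative sketch of an "identification point" in a gating paradigm:
# the earliest gate from which the listener's response matches the target
# emotion and remains correct at all later gates.
def identification_gate(responses, target):
    """responses: per-gate emotion labels, in presentation order.
    Returns the 1-based gate index of the identification point,
    or None if the target is never stably identified."""
    point = None
    for i, resp in enumerate(responses, start=1):
        if resp == target:
            if point is None:
                point = i          # candidate identification point
        else:
            point = None           # identification lost; reset
    return point

# Example: fear is guessed at gate 2, lost at gate 3, then stable from gate 4
print(identification_gate(
    ["anger", "fear", "anger", "fear", "fear", "fear", "fear"], "fear"))  # 4
```

Converting the gate index to milliseconds (as in the reported M values) would then just require the acoustic duration of each gate.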
Guillery-Girard, Bérengère; Clochon, Patrice; Giffard, Bénédicte; Viard, Armelle; Egler, Pierre-Jean; Baleyte, Jean-Marc; Eustache, Francis; Dayan, Jacques
2013-09-01
"Travelling in time," a central feature of episodic memory, is severely affected among individuals with Post Traumatic Stress Disorder (PTSD) with two opposite effects: vivid traumatic memories are unorganized in temporality (bottom-up processes), non-traumatic personal memories tend to lack spatio-temporal details and false recognitions occur more frequently than in the general population (top-down processes). To test the effect of these two types of processes (i.e. bottom-up and top-down) on emotional memory, we conducted two studies in healthy and traumatized adolescents, a period of life in which vulnerability to emotion is particularly high. Using negative and neutral images selected from the International Affective Picture System (IAPS), stimuli were divided into perceptual images (emotion generated by perceptual details) and conceptual images (emotion generated by the general meaning of the material). Both categories of stimuli were then used, along with neutral pictures, in a memory task with two phases (encoding and recognition). In both populations, we reported a differential effect of the emotional material on encoding and recognition. Negative perceptual scenes induced an attentional capture effect during encoding and enhanced the recollective distinctiveness. Conversely, the encoding of conceptual scenes was similar to neutral ones, but the conceptual relatedness induced false memories at retrieval. However, among individuals with PTSD, two subgroups of patients were identified. The first subgroup processed the scenes faster than controls, except for the perceptual scenes, and obtained similar performances to controls in the recognition task. The second subgroup demonstrated an attentional deficit in the encoding task with no benefit from the distinctiveness associated with negative perceptual scenes on memory performances.
These findings provide a new perspective on how negative emotional information may have opposite influences on memory in normal and traumatized individuals. They also give clues to understanding how intrusive memories and overgeneralization take place in PTSD. Copyright © 2013 Elsevier Ltd. All rights reserved.
Impaired perception of facial emotion in developmental prosopagnosia.
Biotti, Federica; Cook, Richard
2016-08-01
Developmental prosopagnosia (DP) is a neurodevelopmental condition characterised by difficulties recognising faces. Despite severe difficulties recognising facial identity, expression recognition is typically thought to be intact in DP; case studies have described individuals who are able to correctly label photographic displays of facial emotion, and no group differences have been reported. This pattern of deficits suggests a locus of impairment relatively late in the face processing stream, after the divergence of expression and identity analysis pathways. To date, however, there has been little attempt to investigate emotion recognition systematically in a large sample of developmental prosopagnosics using sensitive tests. In the present study, we describe three complementary experiments that examine emotion recognition in a sample of 17 developmental prosopagnosics. In Experiment 1, we investigated observers' ability to make binary classifications of whole-face expression stimuli drawn from morph continua. In Experiment 2, observers judged facial emotion using only the eye-region (the rest of the face was occluded). Analyses of both experiments revealed diminished ability to classify facial expressions in our sample of developmental prosopagnosics, relative to typical observers. Imprecise expression categorisation was particularly evident in those individuals exhibiting apperceptive profiles, associated with problems encoding facial shape accurately. Having split the sample of prosopagnosics into apperceptive and non-apperceptive subgroups, only the apperceptive prosopagnosics were impaired relative to typical observers. In our third experiment, we examined observers' ability to classify the emotion present within segments of vocal affect. Despite difficulties judging facial emotion, the prosopagnosics exhibited excellent recognition of vocal affect.
Contrary to the prevailing view, our results suggest that many prosopagnosics do experience difficulties classifying expressions, particularly those with apperceptive profiles. These individuals may have difficulties forming view-invariant structural descriptions at an early stage in the face processing stream, before identity and expression pathways diverge. Copyright © 2016 Elsevier Ltd. All rights reserved.
Stability of facial emotion recognition performance in bipolar disorder.
Martino, Diego J; Samamé, Cecilia; Strejilevich, Sergio A
2016-09-30
The aim of this study was to assess the performance in emotional processing over time in a sample of euthymic patients with bipolar disorder (BD). Performance in the facial recognition of the six basic emotions (surprise, anger, sadness, happiness, disgust, and fear) did not change during a follow-up period of almost 7 years. These preliminary results suggest that performance in facial emotion recognition might be stable over time in BD. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
2015-05-28
[Fragmentary record: describes a network for real-time speech-emotion recognition. The recoverable content notes that emotion recognition from speech is simpler and requires fewer computational resources than other inputs such as facial expressions, and cites the Berlin Database of Emotional Speech along with Scherer, Johnstone, and Klasmeyer's work on vocal expression of emotion; the remainder is report-form boilerplate.]
Vogel, Bastian D; Brück, Carolin; Jacob, Heike; Eberle, Mark; Wildgruber, Dirk
2016-07-07
Impaired interpretation of nonverbal emotional cues in patients with schizophrenia has been reported in several studies and a clinical relevance of these deficits for social functioning has been assumed. However, it is unclear to what extent the impairments depend on specific emotions or specific channels of nonverbal communication. Here, the effect of cue modality and emotional categories on accuracy of emotion recognition was evaluated in 21 patients with schizophrenia and compared to a healthy control group (n = 21). To this end, dynamic stimuli comprising speakers of both genders in three different sensory modalities (auditory, visual and audiovisual) and five emotional categories (happy, alluring, neutral, angry and disgusted) were used. Patients with schizophrenia were found to be impaired in emotion recognition in comparison to the control group across all stimuli. Considering specific emotions more severe deficits were revealed in the recognition of alluring stimuli and less severe deficits in the recognition of disgusted stimuli as compared to all other emotions. Regarding cue modality the extent of the impairment in emotional recognition did not significantly differ between auditory and visual cues across all emotional categories. However, patients with schizophrenia showed significantly more severe disturbances for vocal as compared to facial cues when sexual interest is expressed (alluring stimuli), whereas more severe disturbances for facial as compared to vocal cues were observed when happiness or anger is expressed. Our results confirmed that perceptual impairments can be observed for vocal as well as facial cues conveying various social and emotional connotations. The observed differences in severity of impairments with most severe deficits for alluring expressions might be related to specific difficulties in recognizing the complex social emotional information of interpersonal intentions as compared to "basic" emotional states. 
Therefore, future studies evaluating perception of nonverbal cues should consider a broader range of social and emotional signals beyond basic emotions including attitudes and interpersonal intentions. Identifying specific domains of social perception particularly prone for misunderstandings in patients with schizophrenia might allow for a refinement of interventions aiming at improving social functioning.
Ameller, Aurely; Picard, Aline; D'Hondt, Fabien; Vaiva, Guillaume; Thomas, Pierre; Pins, Delphine
2017-01-01
Familiarity is a subjective sensation that contributes to person recognition. This process is described as an emotion-based memory-trace of previous meetings and could be disrupted in schizophrenia. Consequently, familiarity disorders could be involved in the impaired social interactions observed in patients with schizophrenia. Previous studies have primarily focused on famous people recognition. Our aim was to identify underlying features, such as emotional disturbances, that may contribute to familiarity disorders in schizophrenia. We hypothesize that patients with familiarity disorders will exhibit a lack of familiarity that could be detected by a flattened skin conductance response (SCR). The SCR was recorded to test the hypothesis that emotional reactivity disturbances occur in patients with schizophrenia during the categorization of specific familiar, famous and unknown faces as male or female. Forty-eight subjects were divided into the following 3 matched groups with 16 subjects per group: control subjects, schizophrenic people with familiarity disorder, and schizophrenic people without familiarity disorders. Emotional arousal is reflected by the skin conductance measures. The control subjects and the patients without familiarity disorders experienced a differential emotional response to the specific familiar faces compared with that to the unknown faces. Nevertheless, overall, the schizophrenic patients without familiarity disorders showed a weaker response across conditions compared with the control subjects. In contrast, the patients with familiarity disorders did not show any significant differences in their emotional response to the faces, regardless of the condition. Only patients with familiarity disorders fail to exhibit a difference in emotional response between familiar and non-familiar faces. These patients likely emotionally process familiar faces similarly to unknown faces. 
Hence, the lower feelings of familiarity in schizophrenia may be a premise enabling the emergence of familiarity disorders.
Emotional recognition from the speech signal for a virtual education agent
NASA Astrophysics Data System (ADS)
Tickle, A.; Raghu, S.; Elshaw, M.
2013-06-01
This paper explores the extraction of features from the speech wave to perform intelligent emotion recognition. A feature extract tool (openSmile) was used to obtain a baseline set of 998 acoustic features from a set of emotional speech recordings from a microphone. The initial features were reduced to the most important ones so recognition of emotions using a supervised neural network could be performed. Given that the future use of virtual education agents lies with making the agents more interactive, developing agents with the capability to recognise and adapt to the emotional state of humans is an important step.
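As a hedged sketch of the pipeline described (a large openSMILE acoustic feature set reduced to the most important features, then a supervised neural network), the following uses scikit-learn on synthetic data standing in for the 998 extracted features. The selector, network size, and number of emotion classes are assumptions, not the paper's actual choices.

```python
# Illustrative pipeline: feature reduction followed by a supervised neural
# network, mirroring the approach described in the abstract. Data are
# synthetic stand-ins for openSMILE's 998 acoustic features.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 998))       # 200 utterances x 998 features
y = rng.integers(0, 4, size=200)      # 4 hypothetical emotion classes
X[:, :10] += y[:, None] * 0.5         # make a few features informative

clf = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=50),     # keep the 50 most discriminative
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0),
)
clf.fit(X, y)
print(clf.score(X, y))                # training accuracy on synthetic data
```

A real agent would replace the synthetic matrix with per-utterance openSMILE feature vectors and evaluate on held-out recordings rather than training accuracy.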
Gramaglia, Carla; Ressico, Francesca; Gambaro, Eleonora; Palazzolo, Anna; Mazzarino, Massimiliano; Bert, Fabrizio; Siliquini, Roberta; Zeppegno, Patrizia
2016-08-01
Alexithymia, difficulties in facial emotion recognition, and poor socio-relational skills are typical of anorexia nervosa (AN). We assessed patients with AN and healthy controls (HCs) with mixed stimuli: questionnaires (Toronto Alexithymia Scale-TAS, Interpersonal Reactivity Index-IRI), photographs (Facial Emotion Identification Test-FEIT) and dynamic images (The Awareness of Social Inference Test-TASIT). TAS and IRI Personal Distress (PD) scores were higher in AN than in HCs. Few differences emerged on the FEIT and none on the TASIT. Larger effect sizes were found for the TAS results. Despite higher levels of alexithymia, patients with AN seem to properly acknowledge others' emotions while being inhibited in the expression of their own. Copyright © 2016 Elsevier Ltd. All rights reserved.
Altered brain mechanisms of emotion processing in pre-manifest Huntington's disease.
Novak, Marianne J U; Warren, Jason D; Henley, Susie M D; Draganski, Bogdan; Frackowiak, Richard S; Tabrizi, Sarah J
2012-04-01
Huntington's disease is an inherited neurodegenerative disease that causes motor, cognitive and psychiatric impairment, including an early decline in ability to recognize emotional states in others. The pathophysiology underlying the earliest manifestations of the disease is not fully understood; the objective of our study was to clarify this. We used functional magnetic resonance imaging to investigate changes in brain mechanisms of emotion recognition in pre-manifest carriers of the abnormal Huntington's disease gene (subjects with pre-manifest Huntington's disease): 16 subjects with pre-manifest Huntington's disease and 14 control subjects underwent 1.5 tesla magnetic resonance scanning while viewing pictures of facial expressions from the Ekman and Friesen series. Disgust, anger and happiness were chosen as emotions of interest. Disgust is the emotion in which recognition deficits have most commonly been detected in Huntington's disease; anger is the emotion in which impaired recognition was detected in the largest behavioural study of emotion recognition in pre-manifest Huntington's disease to date; and happiness is a positive emotion to contrast with disgust and anger. Ekman facial expressions were also used to quantify emotion recognition accuracy outside the scanner and structural magnetic resonance imaging with voxel-based morphometry was used to assess the relationship between emotion recognition accuracy and regional grey matter volume. Emotion processing in pre-manifest Huntington's disease was associated with reduced neural activity for all three emotions in partially separable functional networks. Furthermore, the Huntington's disease-associated modulation of disgust and happiness processing was negatively correlated with genetic markers of pre-manifest disease progression in distributed, largely extrastriatal networks. 
The modulated disgust network included insulae, cingulate cortices, pre- and postcentral gyri, precunei, cunei, bilateral putamina, right pallidum, right thalamus, cerebellum, middle frontal, middle occipital, right superior and left inferior temporal gyri, and left superior parietal lobule. The modulated happiness network included postcentral gyri, left caudate, right cingulate cortex, right superior and inferior parietal lobules, and right superior frontal, middle temporal, middle occipital and precentral gyri. These effects were not driven merely by striatal dysfunction. We did not find equivalent associations between brain structure and emotion recognition, and the pre-manifest Huntington's disease cohort did not have a behavioural deficit in out-of-scanner emotion recognition relative to controls. In addition, we found increased neural activity in the pre-manifest subjects in response to all three emotions in frontal regions, predominantly in the middle frontal gyri. Overall, these findings suggest that pathophysiological effects of Huntington's disease may precede the development of overt clinical symptoms and detectable cerebral atrophy.
Labelling Facial Affect in Context in Adults with and without TBI
Turkstra, Lyn S.; Kraning, Sarah G.; Riedeman, Sarah K.; Mutlu, Bilge; Duff, Melissa; VanDenHeuvel, Sara
2017-01-01
Recognition of facial affect has been studied extensively in adults with and without traumatic brain injury (TBI), mostly by asking examinees to match basic emotion words to isolated faces. This method may not capture affect labelling in everyday life when faces are in context and choices are open-ended. To examine effects of context and response format, we asked 148 undergraduate students to label emotions shown on faces either in isolation or in natural visual scenes. Responses were categorised as representing basic emotions, social emotions, cognitive state terms, or appraisals. We used students’ responses to create a scoring system that was applied prospectively to five men with TBI. In both groups, over 50% of responses were neither basic emotion words nor synonyms, and there was no significant difference in response types between faces alone vs. in scenes. Adults with TBI used labels not seen in students’ responses, talked more overall, and often gave multiple labels for one photo. Results suggest benefits of moving beyond forced-choice tests of faces in isolation to fully characterise affect recognition in adults with and without TBI. PMID:29093643
Emotion Recognition from EEG Signals Using Multidimensional Information in EMD Domain.
Zhuang, Ning; Zeng, Ying; Tong, Li; Zhang, Chi; Zhang, Hanming; Yan, Bin
2017-01-01
This paper introduces a method for feature extraction and emotion recognition based on empirical mode decomposition (EMD). Using EMD, EEG signals are decomposed into Intrinsic Mode Functions (IMFs) automatically. Multidimensional information from the IMFs is used as features: the first difference of the time series, the first difference of the phase, and the normalized energy. The performance of the proposed method is verified on a publicly available emotional database. The results show that the three features are effective for emotion recognition. The role of each IMF is examined, and we find that the high-frequency component IMF1 has a significant effect on the detection of different emotional states. The informative electrodes based on the EMD strategy are analyzed. In addition, the classification accuracy of the proposed method is compared with several classical techniques, including fractal dimension (FD), sample entropy, differential entropy, and discrete wavelet transform (DWT). Experimental results on the DEAP dataset demonstrate that our method can improve emotion recognition performance.
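The three per-IMF features named in the abstract (first difference of the time series, first difference of the phase, normalized energy) can be sketched as follows. This is an illustrative reimplementation, not the authors' code: a real pipeline would first decompose the EEG into IMFs via EMD (e.g. with the PyEMD package), whereas here a synthetic oscillation stands in for one IMF.

```python
# Illustrative computation of the three IMF features described above.
# A pure sinusoid stands in for an IMF obtained by EMD.
import numpy as np
from scipy.signal import hilbert

def imf_features(imf, total_energy):
    """Return the three features for one IMF."""
    # 1) mean absolute first difference of the time series
    d_signal = np.mean(np.abs(np.diff(imf)))
    # 2) mean absolute first difference of the instantaneous phase,
    #    obtained from the analytic signal (Hilbert transform)
    phase = np.unwrap(np.angle(hilbert(imf)))
    d_phase = np.mean(np.abs(np.diff(phase)))
    # 3) energy of this IMF normalized by the total signal energy
    norm_energy = np.sum(imf ** 2) / total_energy
    return d_signal, d_phase, norm_energy

fs = 128                                  # sampling rate (Hz), illustrative
t = np.arange(0, 2, 1 / fs)
imf1 = np.sin(2 * np.pi * 30 * t)         # 30 Hz high-frequency component
feats = imf_features(imf1, total_energy=np.sum(imf1 ** 2))
print(feats)
```

For a pure tone, the phase-difference feature reduces to the angular frequency per sample (here 2π·30/128 rad), which makes it a frequency-sensitive descriptor.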
Investigating Patterns for Self-Induced Emotion Recognition from EEG Signals.
Zhuang, Ning; Zeng, Ying; Yang, Kai; Zhang, Chi; Tong, Li; Yan, Bin
2018-03-12
Most current approaches to emotion recognition are based on neural signals elicited by affective materials such as images, sounds and videos. However, the application of neural patterns in the recognition of self-induced emotions remains uninvestigated. In this study we inferred the patterns and neural signatures of self-induced emotions from electroencephalogram (EEG) signals. The EEG signals of 30 participants were recorded while they watched 18 Chinese movie clips which were intended to elicit six discrete emotions, including joy, neutrality, sadness, disgust, anger and fear. After watching each movie clip the participants were asked to self-induce emotions by recalling a specific scene from each movie. We analyzed the important features, electrode distribution and average neural patterns of different self-induced emotions. Results demonstrated that features related to high-frequency rhythm of EEG signals from electrodes distributed in the bilateral temporal, prefrontal and occipital lobes have outstanding performance in the discrimination of emotions. Moreover, the six discrete categories of self-induced emotion exhibit specific neural patterns and brain topography distributions. We achieved an average accuracy of 87.36% in the discrimination of positive from negative self-induced emotions and 54.52% in the classification of emotions into six discrete categories. Our research will help promote the development of comprehensive endogenous emotion recognition methods.
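The abstract highlights high-frequency EEG rhythms as the most discriminative features. One common operationalization, offered here purely as a hedged sketch rather than the authors' exact method, is band power after band-pass filtering:

```python
# Illustrative band-power feature for a high-frequency (gamma) EEG rhythm.
# The Butterworth filter design is an assumption, not the authors' method.
import numpy as np
from scipy.signal import butter, filtfilt

def band_power(signal, fs, lo, hi, order=4):
    """Mean power of `signal` within the [lo, hi] Hz band."""
    nyq = fs / 2
    b, a = butter(order, [lo / nyq, hi / nyq], btype="band")
    return np.mean(filtfilt(b, a, signal) ** 2)

fs = 200
t = np.arange(0, 4, 1 / fs)
# Synthetic channel: a 10 Hz alpha component plus a stronger 40 Hz gamma one
eeg = np.sin(2 * np.pi * 10 * t) + 2.0 * np.sin(2 * np.pi * 40 * t)
gamma = band_power(eeg, fs, 30, 45)  # high-frequency (gamma) power
alpha = band_power(eeg, fs, 8, 13)   # alpha power, for comparison
print(gamma > alpha)
```

Computed per electrode, such band-power values form the kind of feature vector that the reported classifiers would consume.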
Parents' Emotion-Related Beliefs, Behaviours, and Skills Predict Children's Recognition of Emotion
ERIC Educational Resources Information Center
Castro, Vanessa L.; Halberstadt, Amy G.; Lozada, Fantasy T.; Craig, Ashley B.
2015-01-01
Children who are able to recognize others' emotions are successful in a variety of socioemotional domains, yet we know little about how school-aged children's abilities develop, particularly in the family context. We hypothesized that children develop emotion recognition skill as a function of parents' own emotion-related beliefs,…
ERIC Educational Resources Information Center
Williams, Beth T.; Gray, Kylie M.; Tonge, Bruce J.
2012-01-01
Background: Children with autism have difficulties in emotion recognition and a number of interventions have been designed to target these problems. However, few emotion training interventions have been trialled with young children with autism and co-morbid ID. This study aimed to evaluate the efficacy of an emotion training programme for a group…
A voxel-based lesion study on facial emotion recognition after penetrating brain injury
Dal Monte, Olga; Solomon, Jeffrey M.; Schintu, Selene; Knutson, Kristine M.; Strenziok, Maren; Pardini, Matteo; Leopold, Anne; Raymont, Vanessa; Grafman, Jordan
2013-01-01
The ability to read emotions in the face of another person is an important social skill that can be impaired in subjects with traumatic brain injury (TBI). To determine the brain regions that modulate facial emotion recognition, we conducted a whole-brain analysis using a well-validated facial emotion recognition task and voxel-based lesion symptom mapping (VLSM) in a large sample of patients with focal penetrating TBIs (pTBIs). Our results revealed that individuals with pTBI performed significantly worse than normal controls in recognizing unpleasant emotions. VLSM results showed that impairment in facial emotion recognition was due to damage in a bilateral fronto-temporo-limbic network, including medial prefrontal cortex (PFC), anterior cingulate cortex, left insula and temporal areas. Besides those common areas, damage to the bilateral and anterior regions of PFC led to impairment in recognizing unpleasant emotions, whereas bilateral posterior PFC and left temporal areas led to impairment in recognizing pleasant emotions. Our findings add empirical evidence that the ability to read pleasant and unpleasant emotions in other people's faces is a complex process involving not only a common network that includes bilateral fronto-temporo-limbic lobes, but also other regions depending on emotional valence. PMID:22496440
Huang, Charles Lung-Cheng; Hsiao, Sigmund; Hwu, Hai-Gwo; Howng, Shen-Long
2012-12-30
The Chinese Facial Emotion Recognition Database (CFERD), a computer-generated three-dimensional (3D) paradigm, was developed to measure the recognition of facial emotional expressions at different intensities. The stimuli consisted of 3D colour photographic images of six basic facial emotional expressions (happiness, sadness, disgust, fear, anger and surprise) and neutral faces of the Chinese. The purpose of the present study is to describe the development and validation of the CFERD with nonclinical healthy participants (N=100; 50 men; age ranging between 18 and 50 years), and to generate a normative data set. Performance was characterized with the sensitivity index d' [d' = Z(hit rate) - Z(false alarm rate), where Z(p), p∈[0,1], is the inverse of the standard normal cumulative distribution function].
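The bracketed d' definition can be computed directly. A minimal sketch using scipy, assuming (as is standard in signal detection theory) that Z is the inverse of the standard normal cumulative distribution function:

```python
# Sensitivity index d' = Z(hit rate) - Z(false alarm rate),
# where Z is the inverse of the standard normal CDF (scipy's norm.ppf).
from scipy.stats import norm

def d_prime(hit_rate, false_alarm_rate):
    """Signal-detection sensitivity index d'."""
    return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

# Example: 84% hits with 16% false alarms gives d' of about 1.99;
# chance performance (equal hit and false-alarm rates) gives d' = 0.
print(round(d_prime(0.84, 0.16), 2))
```

In practice, hit and false-alarm rates of exactly 0 or 1 are first nudged away from the boundaries (e.g. the log-linear correction), since norm.ppf is infinite there.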
Deceived, Disgusted, and Defensive: Motivated Processing of Anti-Tobacco Advertisements.
Leshner, Glenn; Clayton, Russell B; Bolls, Paul D; Bhandari, Manu
2017-08-29
A 2 × 2 experiment was conducted, where participants watched anti-tobacco messages that varied in deception (content portraying tobacco companies as dishonest) and disgust (negative graphic images) content. Psychophysiological measures, self-report, and a recognition test were used to test hypotheses generated from the motivated cognition framework. The results of this study indicate that messages containing both deception and disgust push viewers into a cascade of defensive responses reflected by increased self-reported unpleasantness, reduced resources allocated to encoding, worsened recognition memory, and dampened emotional responses compared to messages depicting one attribute or neither. Findings from this study demonstrate the value of applying a motivated cognition theoretical framework in research on responses to emotional content in health messages and support previous research on defensive processing and message design of anti-tobacco messages.
Narme, Pauline; Mouras, Harold; Roussel, Martine; Duru, Cécile; Krystkowiak, Pierre; Godefroy, Olivier
2013-03-01
Parkinson's disease (PD) is associated with behavioral disorders that can affect social functioning but are poorly understood. Since emotional and cognitive social processes are known to be crucial in social relationships, impairment of these processes may account for the emergence of behavioral disorders. We used a systematic battery of tests to assess emotional processes and social cognition in PD patients and relate our findings to conventional neuropsychological data (especially behavioral disorders). Twenty-three PD patients and 46 controls (matched for age and educational level) were included in the study and underwent neuropsychological testing, including an assessment of the behavioral and cognitive components of executive function. Emotional and cognitive social processes were assessed with the Interpersonal Reactivity Index caregiver-administered questionnaire (as a measure of empathy), a facial emotion recognition task and two theory of mind (ToM) tasks. When compared with controls, PD patients showed low levels of empathy (p = .006), impaired facial emotion recognition (which persisted after correction for perceptual abilities) (p = .001), poor performance in a second-order ToM task (p = .008) that assessed both cognitive (p = .004) and affective (p = .03) inferences and, lastly, frequent dysexecutive behavioral disorders (in over 40% of the patients). Overall, impaired emotional and cognitive social functioning was observed in 17% of patients and was related to certain cognitive dysexecutive disorders. In terms of behavioral dysexecutive disorders, social behavior disorders were related to impaired emotional and cognitive social functioning (p = .04) but were independent of cognitive impairments. Emotional and cognitive social processes were found to be impaired in Parkinson's disease. This impairment may account for the emergence of social behavioral disorders. PsycINFO Database Record (c) 2013 APA, all rights reserved.
Windmann, Sabine; Hill, Holger
2014-10-01
Performance on tasks requiring discrimination of at least two stimuli can be viewed either from an objective perspective (referring to actual stimulus differences), or from a subjective perspective (corresponding to participant's responses). Using event-related potentials recorded during an old/new recognition memory test involving emotionally laden and neutral words studied either blockwise or randomly intermixed, we show here how the objective perspective (old versus new items) yields late effects of blockwise emotional item presentation at parietal sites that the subjective perspective fails to find, whereas the subjective perspective ("old" versus "new" responses) is more sensitive to early effects of emotion at anterior sites than the objective perspective. Our results demonstrate the potential advantage of dissociating the subjective and the objective perspective onto task performance (in addition to analyzing trials with correct responses), especially for investigations of illusions and information processing biases, in behavioral and cognitive neuroscience studies. Copyright © 2014 Elsevier Inc. All rights reserved.
Salience of the lambs: a test of the saliency map hypothesis with pictures of emotive objects.
Humphrey, Katherine; Underwood, Geoffrey; Lambert, Tony
2012-01-25
Humans have an ability to rapidly detect emotive stimuli. However, many emotional objects in a scene are also highly visually salient, which raises the question of how dependent the effects of emotionality are on visual saliency and whether the presence of an emotional object changes the power of a more visually salient object in attracting attention. Participants were shown a set of positive, negative, and neutral pictures and completed recall and recognition memory tests. Eye movement data revealed that visual saliency does influence eye movements, but the effect is reliably reduced when an emotional object is present. Pictures containing negative objects were recognized more accurately and recalled in greater detail, and participants fixated more on negative objects than positive or neutral ones. Initial fixations were more likely to be on emotional objects than more visually salient neutral ones, suggesting that the processing of emotional features occurs at a very early stage of perception.
Pistoia, Francesca; Carolei, Antonio; Sacco, Simona; Conson, Massimiliano; Pistarini, Caterina; Cazzulani, Benedetta; Stewart, Janet; Franceschini, Marco; Sarà, Marco
2015-12-15
There is much evidence to suggest that recognizing and sharing emotions with others require a first-hand experience of those emotions in our own body which, in turn, depends on the adequate perception of our own internal state (interoception) through preserved sensory pathways. Here we explored the contribution of interoception to first-hand emotional experiences and to the recognition of others' emotions. For this aim, 10 individuals with sensory deafferentation as a consequence of high spinal cord injury (SCI; five males and five females; mean age, 48 ± 14.8 years) and 20 healthy subjects matched for age, sex, and education were included in the study. Recognition of facial expressions and judgment of emotionally evocative scenes were investigated in both groups using the Ekman and Friesen set of Pictures of Facial Affect and the International Affective Picture System. A two-way mixed analysis of variance and post hoc comparisons were used to test differences among emotions and groups. Compared with healthy subjects, individuals with SCI, when asked to judge emotionally evocative scenes, had difficulties in judging their own emotional response to complex scenes eliciting fear and anger, while they were able to recognize the same emotions when conveyed by facial expressions. Our findings endorse a simulative view of emotional processing according to which the proper perception of our own internal state (interoception), through preserved sensory pathways, is crucial for first-hand experiences of the more primordial emotions, such as fear and anger.
Does Facial Amimia Impact the Recognition of Facial Emotions? An EMG Study in Parkinson’s Disease
Argaud, Soizic; Delplanque, Sylvain; Houvenaghel, Jean-François; Auffret, Manon; Duprez, Joan; Vérin, Marc; Grandjean, Didier; Sauleau, Paul
2016-01-01
According to embodied simulation theory, understanding other people’s emotions is fostered by facial mimicry. However, studies assessing the effect of facial mimicry on the recognition of emotion are still controversial. In Parkinson’s disease (PD), one of the most distinctive clinical features is facial amimia, a reduction in facial expressiveness, but patients also show emotional disturbances. The present study used the pathological model of PD to examine the role of facial mimicry on emotion recognition by investigating EMG responses in PD patients during a facial emotion recognition task (anger, joy, neutral). Our results evidenced a significant decrease in facial mimicry for joy in PD, essentially linked to the absence of reaction of the zygomaticus major and the orbicularis oculi muscles in response to happy avatars, whereas facial mimicry for expressions of anger was relatively preserved. We also confirmed that PD patients were less accurate in recognizing positive and neutral facial expressions and highlighted a beneficial effect of facial mimicry on the recognition of emotion. We thus provide additional arguments for embodied simulation theory suggesting that facial mimicry is a potential lever for therapeutic actions in PD even if it seems not to be necessarily required in recognizing emotion as such. PMID:27467393
Lu, Lingxi; Bao, Xiaohan; Chen, Jing; Qu, Tianshu; Wu, Xihong; Li, Liang
2018-05-01
Under a noisy "cocktail-party" listening condition with multiple people talking, listeners can use various perceptual/cognitive unmasking cues to improve recognition of the target speech against informational speech-on-speech masking. One potential unmasking cue is the emotion expressed in a speech voice, by means of certain acoustical features. However, it was unclear whether emotionally conditioning a target-speech voice that has none of the typical acoustical features of emotions (i.e., an emotionally neutral voice) can be used by listeners for enhancing target-speech recognition under speech-on-speech masking conditions. In this study we examined the recognition of target speech against a two-talker speech masker both before and after the emotionally neutral target voice was paired with a loud female screaming sound that has a marked negative emotional valence. The results showed that recognition of the target speech (especially the first keyword in a target sentence) was significantly improved by emotionally conditioning the target speaker's voice. Moreover, the emotional unmasking effect was independent of the unmasking effect of the perceived spatial separation between the target speech and the masker. Also, (skin conductance) electrodermal responses became stronger after emotional learning when the target speech and masker were perceptually co-located, suggesting an increase of listening efforts when the target speech was informationally masked. These results indicate that emotionally conditioning the target speaker's voice does not change the acoustical parameters of the target-speech stimuli, but the emotionally conditioned vocal features can be used as cues for unmasking target speech.
Emotional System for Military Target Identification
2009-10-01
algorithm [23], and used it to solve a facial recognition problem. In other works [24,25], we explored the potential of using emotional neural...other application areas, such as security (facial recognition) and medical (blood cell identification), can also be efficiently used in military...Application of an emotional neural network to facial recognition. Neural Computing and Applications, 18(4), 309-320. [25] Khashman, A. (2009). Blood cell
Early effects of duloxetine on emotion recognition in healthy volunteers
Bamford, Susan; Penton-Voak, Ian; Pinkney, Verity; Baldwin, David S; Munafò, Marcus R; Garner, Matthew
2015-01-01
The serotonin-noradrenaline reuptake inhibitor (SNRI) duloxetine is an effective treatment for major depression and generalised anxiety disorder. Neuropsychological models of antidepressant drug action suggest therapeutic effects might be mediated by the early correction of maladaptive biases in emotion processing, including the recognition of emotional expressions. Sub-chronic administration of duloxetine (for two weeks) produces adaptive changes in neural circuitry implicated in emotion processing; however, its effects on emotional expression recognition are unknown. Forty healthy participants were randomised to receive either 14 days of duloxetine (60 mg/day, titrated from 30 mg after three days) or matched placebo (with sham titration) in a double-blind, between-groups, repeated-measures design. On day 0 and day 14 participants completed a computerised emotional expression recognition task that measured sensitivity to the six primary emotions. Thirty-eight participants (19 per group) completed their course of tablets and were included in the analysis. Results provide evidence that duloxetine, compared to placebo, may reduce the accurate recognition of sadness. Drug effects were driven by changes in participants’ ability to correctly detect subtle expressions of sadness, with greater change observed in the placebo relative to the duloxetine group. These effects occurred in the absence of changes in mood. Our preliminary findings require replication, but complement recent evidence that sadness recognition is a therapeutic target in major depression, and a mechanism through which SNRIs could resolve negative biases in emotion processing to achieve therapeutic effects. PMID:25759400
Monetary incentives at retrieval promote recognition of involuntarily learned emotional information.
Yan, Chunping; Li, Yunyun; Zhang, Qin; Cui, Lixia
2018-03-07
Previous studies have suggested that the effects of reward on memory processes are affected by certain factors, but it remains unclear whether the effects of reward at retrieval on recognition processes are influenced by emotion. The event-related potential was used to investigate the combined effect of reward and emotion on memory retrieval and its neural mechanism. The behavioral results indicated that the reward at retrieval improved recognition performance under positive and negative emotional conditions. The event-related potential results indicated that there were significant interactions between the reward and emotion in the average amplitude during recognition, and the significant reward effects from the frontal to parietal brain areas appeared at 130-800 ms for positive pictures and at 190-800 ms for negative pictures, but there were no significant reward effects of neutral pictures; the reward effect of positive items appeared relatively earlier, starting at 130 ms, and that of negative pictures began at 190 ms. These results indicate that monetary incentives at retrieval promote recognition of involuntarily learned emotional information.
Emotion Recognition in Face and Body Motion in Bulimia Nervosa.
Dapelo, Marcela Marin; Surguladze, Simon; Morris, Robin; Tchanturia, Kate
2017-11-01
Social cognition has been studied extensively in anorexia nervosa (AN), but there are few studies in bulimia nervosa (BN). This study investigated the ability of people with BN to recognise emotions in ambiguous facial expressions and in body movement. Participants were 26 women with BN, who were compared with 35 with AN, and 42 healthy controls. Participants completed an emotion recognition task by using faces portraying blended emotions, along with a body emotion recognition task by using videos of point-light walkers. The results indicated that BN participants exhibited difficulties recognising disgust in less-ambiguous facial expressions, and a tendency to interpret non-angry faces as anger, compared with healthy controls. These difficulties were similar to those found in AN. There were no significant differences amongst the groups in body motion emotion recognition. The findings suggest that difficulties with disgust and anger recognition in facial expressions may be shared transdiagnostically in people with eating disorders. Copyright © 2017 John Wiley & Sons, Ltd and Eating Disorders Association.
Recognition of emotion from body language among patients with unipolar depression
Loi, Felice; Vaidya, Jatin G.; Paradiso, Sergio
2013-01-01
Major depression may be associated with abnormal perception of emotions and impairment in social adaptation. Emotion recognition from body language and its possible implications to social adjustment have not been examined in patients with depression. Three groups of participants (51 with depression; 68 with history of depression in remission; and 69 never depressed healthy volunteers) were compared on static and dynamic tasks of emotion recognition from body language. Psychosocial adjustment was assessed using the Social Adjustment Scale Self-Report (SAS-SR). Participants with current depression showed reduced recognition accuracy for happy stimuli across tasks relative to remission and comparison participants. Participants with depression tended to show poorer psychosocial adaptation relative to remission and comparison groups. Correlations between perception accuracy of happiness and scores on the SAS-SR were largely not significant. These results indicate that depression is associated with reduced ability to appraise positive stimuli of emotional body language but emotion recognition performance is not tied to social adjustment. These alterations do not appear to be present in participants in remission suggesting state-like qualities. PMID:23608159
Rupp, Claudia I; Derntl, Birgit; Osthaus, Friederike; Kemmler, Georg; Fleischhacker, W Wolfgang
2017-12-01
Despite growing evidence for neurobehavioral deficits in social cognition in alcohol use disorder (AUD), the clinical relevance remains unclear, and little is known about its impact on treatment outcome. This study prospectively investigated the impact of neurocognitive social abilities at treatment onset on treatment completion. Fifty-nine alcohol-dependent patients were assessed with measures of social cognition including 3 core components of empathy via paradigms measuring: (i) emotion recognition (the ability to recognize emotions via facial expression), (ii) emotional perspective taking, and (iii) affective responsiveness at the beginning of inpatient treatment for alcohol dependence. Subjective measures were also obtained, including estimates of task performance and a self-report measure of empathic abilities (Interpersonal Reactivity Index). According to treatment outcomes, patients were divided into a patient group with a regular treatment course (e.g., with planned discharge and without relapse during treatment) or an irregular treatment course (e.g., relapse and/or premature and unplanned termination of treatment, "dropout"). Compared with patients completing treatment in a regular fashion, patients with relapse and/or dropout of treatment had significantly poorer facial emotion recognition ability at treatment onset. Additional logistic regression analyses confirmed these results and identified poor emotion recognition performance as a significant predictor for relapse/dropout. Self-report (subjective) measures did not correspond with the neurobehavioral social cognition measures (objective task performance). Analyses of individual subtypes of facial emotions revealed poorer recognition particularly of disgust, anger, and neutral (no-emotion) faces in patients with relapse/dropout. Social cognition in AUD is clinically relevant. 
Less successful treatment outcome was associated with poorer facial emotion recognition ability at the beginning of treatment. Impaired facial emotion recognition represents a neurocognitive risk factor that should be taken into account in alcohol dependence treatment. Treatments targeting the improvement of these social cognition deficits in AUD may offer a promising future approach. Copyright © 2017 by the Research Society on Alcoholism.
Stroud, J B; Freeman, T P; Leech, R; Hindocha, C; Lawn, W; Nutt, D J; Curran, H V; Carhart-Harris, R L
2018-02-01
Depressed patients robustly exhibit affective biases in emotional processing which are altered by SSRIs and predict clinical outcome. The objective of this study is to investigate whether psilocybin, recently shown to rapidly improve mood in treatment-resistant depression (TRD), alters patients' emotional processing biases. Seventeen patients with treatment-resistant depression completed a dynamic emotional face recognition task at baseline and 1 month later after two doses of psilocybin with psychological support. Sixteen controls completed the emotional recognition task over the same time frame but did not receive psilocybin. We found evidence for a group × time interaction on speed of emotion recognition (p = .035). At baseline, patients were slower at recognising facial emotions compared with controls (p < .001). After psilocybin, this difference was remediated (p = .208). Emotion recognition was faster at follow-up compared with baseline in patients (p = .004, d = .876) but not controls (p = .263, d = .302). In patients, this change was significantly correlated with a reduction in anhedonia over the same time period (r = .640, p = .010). Psilocybin with psychological support appears to improve processing of emotional faces in treatment-resistant depression, and this correlates with reduced anhedonia. Placebo-controlled studies are warranted to follow up these preliminary findings.
Simpson, Claire; Pinkham, Amy E; Kelsven, Skylar; Sasson, Noah J
2013-12-01
Emotion can be expressed by both the voice and face, and previous work suggests that presentation modality may impact emotion recognition performance in individuals with schizophrenia. We investigated the effect of stimulus modality on emotion recognition accuracy and the potential role of visual attention to faces in emotion recognition abilities. Thirty-one patients who met DSM-IV criteria for schizophrenia (n=8) or schizoaffective disorder (n=23) and 30 non-clinical control individuals participated. Both groups identified emotional expressions in three different conditions: audio only, visual only, combined audiovisual. In the visual only and combined conditions, time spent visually fixating salient features of the face were recorded. Patients were significantly less accurate than controls in emotion recognition during both the audio and visual only conditions but did not differ from controls on the combined condition. Analysis of visual scanning behaviors demonstrated that patients attended less than healthy individuals to the mouth in the visual condition but did not differ in visual attention to salient facial features in the combined condition, which may in part explain the absence of a deficit for patients in this condition. Collectively, these findings demonstrate that patients benefit from multimodal stimulus presentations of emotion and support hypotheses that visual attention to salient facial features may serve as a mechanism for accurate emotion identification. © 2013.
Influence of gender in the recognition of basic facial expressions: A critical literature review
Forni-Santos, Larissa; Osório, Flávia L
2015-01-01
AIM: To conduct a systematic literature review about the influence of gender on the recognition of facial expressions of six basic emotions. METHODS: We made a systematic search with the search terms (face OR facial) AND (processing OR recognition OR perception) AND (emotional OR emotion) AND (gender or sex) in PubMed, PsycINFO, LILACS, and SciELO electronic databases for articles assessing outcomes related to response accuracy and latency and emotional intensity. Article selection was performed according to parameters set by COCHRANE. The reference lists of the articles found through the database search were checked for additional references of interest. RESULTS: With respect to accuracy, women tend to perform better than men when all emotions are considered as a set. Regarding specific emotions, there seems to be no gender-related differences in the recognition of happiness, whereas results are quite heterogeneous with respect to the remaining emotions, especially sadness, anger, and disgust. Fewer articles dealt with the parameters of response latency and emotional intensity, which hinders the generalization of their findings, especially in the face of their methodological differences. CONCLUSION: The analysis of the studies conducted to date does not allow for definite conclusions concerning the role of the observer’s gender in the recognition of facial emotion, mostly because of the absence of standardized methods of investigation. PMID:26425447
ERIC Educational Resources Information Center
Doi, Hirokazu; Fujisawa, Takashi X.; Kanai, Chieko; Ohta, Haruhisa; Yokoi, Hideki; Iwanami, Akira; Kato, Nobumasa; Shinohara, Kazuyuki
2013-01-01
This study investigated the ability of adults with Asperger syndrome to recognize emotional categories of facial expressions and emotional prosodies with graded emotional intensities. The individuals with Asperger syndrome showed poorer recognition performance for angry and sad expressions from both facial and vocal information. The group…
Multimodal approaches for emotion recognition: a survey
NASA Astrophysics Data System (ADS)
Sebe, Nicu; Cohen, Ira; Gevers, Theo; Huang, Thomas S.
2004-12-01
Recent technological advances have enabled human users to interact with computers in ways previously unimaginable. Beyond the confines of the keyboard and mouse, new modalities for human-computer interaction such as voice, gesture, and force-feedback are emerging. Despite important advances, one necessary ingredient for natural interaction is still missing-emotions. Emotions play an important role in human-to-human communication and interaction, allowing people to express themselves beyond the verbal domain. The ability to understand human emotions is desirable for the computer in several applications. This paper explores new ways of human-computer interaction that enable the computer to be more aware of the user's emotional and attentional expressions. We present the basic research in the field and the recent advances into the emotion recognition from facial, voice, and physiological signals, where the different modalities are treated independently. We then describe the challenging problem of multimodal emotion recognition and we advocate the use of probabilistic graphical models when fusing the different modalities. We also discuss the difficult issues of obtaining reliable affective data, obtaining ground truth for emotion recognition, and the use of unlabeled data.
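The fusion idea the survey advocates can be illustrated with a minimal late-fusion sketch. This is not the authors' model: it combines per-modality class posteriors under a naive conditional-independence assumption (the simplest probabilistic graphical model for fusion), with the emotion labels and example posteriors chosen purely for illustration:

```python
import numpy as np

EMOTIONS = ["anger", "happiness", "neutral", "sadness"]

def fuse(posteriors_per_modality, prior=None):
    """Naive-Bayes late fusion of per-modality class posteriors:
    P(e | all modalities) ∝ P(e) * Π_m [P(e | modality m) / P(e)],
    assuming modalities are conditionally independent given the emotion."""
    k = len(EMOTIONS)
    prior = np.full(k, 1.0 / k) if prior is None else np.asarray(prior)
    log_post = np.log(prior)
    for p in posteriors_per_modality:
        log_post += np.log(np.asarray(p)) - np.log(prior)
    post = np.exp(log_post - log_post.max())  # stabilise before normalising
    return post / post.sum()

# Hypothetical outputs of a facial-expression and a vocal-prosody classifier:
face = [0.6, 0.2, 0.1, 0.1]
voice = [0.5, 0.1, 0.2, 0.2]
fused = fuse([face, voice])
print(EMOTIONS[int(np.argmax(fused))])  # -> anger
```

Richer graphical models (e.g. dynamic Bayesian networks over time) relax the independence assumption, but the combination rule above is the degenerate base case.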
Bat-Pitault, F; Da Fonseca, D; Flori, S; Porcher-Guinet, V; Stagnara, C; Patural, H; Franco, P; Deruelle, C
2017-10-01
Emotional processing is characterized by a negative bias in depression, so it was reasonable to establish whether the same is true in very young at-risk children. Furthermore, sleep, also proposed as a marker of depression risk, is closely linked with emotions in adults and adolescents. We therefore aimed first to better describe the characteristics of emotion recognition in 3-year-olds and its links with sleep, and second to determine whether an emotion recognition pattern indicating vulnerability to depression is already present at this young age. We studied, in 133 children aged 36 months from the AuBE cohort, the number of correct answers on a facial emotion recognition task (joy, anger and sadness). Cognitive functions were also assessed with the WPPSI-III at 3 years of age, and sleep parameters (lights-off and lights-on times, sleep durations, difficulty going to sleep and number of parental awakenings per night) were collected through questionnaires completed by mothers at 6, 12, 18, 24 and 36 months after birth. Of these 133 children, 21 whose mothers had at least one history of depression (13 boys) formed the high-risk group, and 19 children (8 boys) born to women with no history of depression formed the low-risk (control) group. Overall, at 36 months children recognized happiness significantly better than the other emotions (P=0.000), global recognition was better in girls (M=8.8) than in boys (M=7.8) (P=0.013), and global recognition ability correlated positively with verbal IQ (P=0.000). Children with less daytime sleep at 18 months and those who slept less at 24 months showed better recognition of sadness (P=0.043 and P=0.042); those with difficulties at bedtime at 18 months recognized happiness less well (P=0.043), and those who woke earlier at 24 months had better global recognition of emotions (P=0.015). Finally, boys in the high-risk group recognized sadness better than boys in the control group (P=0.015). 
This study confirms that emotion recognition is related to development, with a female advantage and a link with language skills at 36 months of life. More importantly, we found a relationship between sleep characteristics and emotion recognition ability, and a negative bias in emotion recognition in young males at risk for depression. Copyright © 2016 L'Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.
Mothersill, David; Dillon, Rachael; Hargreaves, April; Castorina, Marco; Furey, Emilia; Fagan, Andrew J; Meaney, James F; Fitzmaurice, Brian; Hallahan, Brian; McDonald, Colm; Wykes, Til; Corvin, Aiden; Robertson, Ian H; Donohoe, Gary
2018-05-27
Working memory based cognitive remediation therapy (CT) for psychosis has recently been associated with broad improvements in performance on untrained tasks measuring working memory, episodic memory and IQ, and changes in associated brain regions. However, it is unclear if these improvements transfer to the domain of social cognition and neural activity related to performance on social cognitive tasks. We examined performance on the Reading the Mind in the Eyes test (Eyes test) in a large sample of participants with psychosis who underwent working memory based CT (N = 43) compared to a Control Group of participants with psychosis (N = 35). In a subset of this sample, we used functional magnetic resonance imaging (fMRI) to examine changes in neural activity during a facial emotion recognition task in participants who underwent CT (N = 15) compared to a Control Group (N = 15). No significant effects of CT were observed on Eyes test performance or on neural activity during facial emotion recognition, either at p<0.05 family-wise error, or at a p<0.001 uncorrected threshold, within a priori social cognitive regions of interest. This study suggests that working memory based CT does not significantly impact an aspect of social cognition which was measured behaviourally and neurally. It provides further evidence that deficits in the ability to decode mental state from facial expressions are dissociable from working memory deficits, and suggests that future CT programs should target social cognition in addition to working memory for the purposes of further enhancing social function. This article is protected by copyright. All rights reserved.
Lahera, Guillermo; Ruiz, Alicia; Brañas, Antía; Vicens, María; Orozco, Arantxa
Previous studies have linked processing speed with social cognition and functioning in patients with schizophrenia. A discriminant analysis is needed to determine the different components of this neuropsychological construct. This paper analyzes the impact of processing speed, reaction time and sustained attention on social functioning. 98 outpatients aged between 18 and 65 years with a DSM-5 diagnosis of schizophrenia and at least 3 months of clinical stability were recruited. Sociodemographic and clinical data were collected, and the following variables were measured: processing speed (Trail Making Test [TMT], symbol coding [BACS], verbal fluency), simple and choice reaction time, sustained attention, recognition of facial emotions and global functioning. Processing speed (measured only through the BACS), sustained attention (CPT) and choice (but not simple) reaction time were associated with functioning. Facial emotion recognition (FEIT) correlated significantly with scores on measures of processing speed (BACS, Animals, TMT), sustained attention (CPT) and reaction time. The linear regression model showed a significant relationship between functioning, emotion recognition (P=.015) and processing speed (P=.029). Deficits in processing speed and facial emotion recognition are associated with worse global functioning in patients with schizophrenia. Copyright © 2017 SEP y SEPB. Published by Elsevier España, S.L.U. All rights reserved.
A study of speech emotion recognition based on hybrid algorithm
NASA Astrophysics Data System (ADS)
Zhu, Ju-xia; Zhang, Chao; Lv, Zhao; Rao, Yao-quan; Wu, Xiao-pei
2011-10-01
To effectively improve the recognition accuracy of speech emotion recognition systems, a hybrid algorithm combining a Continuous Hidden Markov Model (CHMM), an All-Class-in-One Neural Network (ACON) and a Support Vector Machine (SVM) is proposed. The SVM and ACON methods use global statistics as emotional features, while the CHMM method employs instantaneous features. The proposed method achieves a recognition rate of 92.25% with a rejection rate of 0.78%, a relative improvement of 8.53%, 4.69% and 0.78% over the ACON, CHMM and SVM methods, respectively. The experimental results confirm its efficiency in distinguishing the anger, happiness, neutral and sadness emotional states.
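The core idea of such hybrid systems, combining the outputs of several classifiers and rejecting low-confidence samples, can be sketched as score-level fusion. This is a minimal illustration, not the paper's implementation; the function name, the averaging rule and the 0.5 rejection threshold are assumptions for the sketch:

```python
import numpy as np

EMOTIONS = ["anger", "happiness", "neutral", "sadness"]

def fuse_predictions(model_scores, reject_threshold=0.5):
    """Score-level fusion over several classifiers (e.g. CHMM, ACON, SVM):
    average the per-model class posteriors and reject a sample when the
    fused confidence falls below the threshold (yielding a rejection rate)."""
    fused = np.mean(model_scores, axis=0)            # (n_samples, n_classes)
    fused = fused / fused.sum(axis=1, keepdims=True)  # renormalize posteriors
    labels = fused.argmax(axis=1)
    confident = fused.max(axis=1) >= reject_threshold
    return [EMOTIONS[k] if ok else None               # None marks a rejection
            for k, ok in zip(labels, confident)]
```

A sample on which the hypothetical classifiers disagree (near-uniform fused posterior) is rejected rather than misclassified, which is how a nonzero rejection rate trades coverage for accuracy.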
Biologically inspired emotion recognition from speech
NASA Astrophysics Data System (ADS)
Caponetti, Laura; Buscicchio, Cosimo Alessandro; Castellano, Giovanna
2011-12-01
Emotion recognition has become a fundamental task in human-computer interaction systems. In this article, we propose an emotion recognition approach based on biologically inspired methods. Specifically, emotion classification is performed using a long short-term memory (LSTM) recurrent neural network which is able to recognize long-range dependencies between successive temporal patterns. We propose to represent data using features derived from two different models: mel-frequency cepstral coefficients (MFCC) and the Lyon cochlear model. In the experimental phase, results obtained from the LSTM network and the two different feature sets are compared, showing that features derived from the Lyon cochlear model give better recognition results in comparison with those obtained with the traditional MFCC representation.
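The long short-term memory mechanism the authors rely on can be sketched as a single recurrent time step over one feature frame (such as an MFCC vector). This is a plain-numpy illustration of the standard LSTM gate equations under assumed parameter shapes, not the authors' network:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step. W (4h x d), U (4h x h) and b (4h,) stack the
    input-gate, forget-gate, output-gate and candidate-cell parameters."""
    n = h_prev.size
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[:n])         # input gate: how much new content to write
    f = sigmoid(z[n:2*n])      # forget gate: how much old cell state to keep
    o = sigmoid(z[2*n:3*n])    # output gate: how much cell state to expose
    g = np.tanh(z[3*n:])       # candidate cell content
    c = f * c_prev + i * g     # cell state carries long-range dependencies
    h = o * np.tanh(c)         # hidden state passed to the next time step
    return h, c
```

The gated cell state `c` is what lets the network retain information across many successive frames, which is the "long-range dependencies between successive temporal patterns" property the abstract refers to.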
Effect of Dopamine Therapy on Nonverbal Affect Burst Recognition in Parkinson's Disease
Péron, Julie; Grandjean, Didier; Drapier, Sophie; Vérin, Marc
2014-01-01
Background Parkinson's disease (PD) provides a model for investigating the involvement of the basal ganglia and mesolimbic dopaminergic system in the recognition of emotions from voices (i.e., emotional prosody). Although previous studies of emotional prosody recognition in PD have reported evidence of impairment, none of them compared PD patients at different stages of the disease, or ON and OFF dopamine replacement therapy, making it difficult to determine whether their impairment was due to general cognitive deterioration or to a more specific dopaminergic deficit. Methods We explored the involvement of the dopaminergic pathways in the recognition of nonverbal affect bursts (onomatopoeias) in 15 newly diagnosed PD patients in the early stages of the disease, 15 PD patients in the advanced stages of the disease and 15 healthy controls. The early PD group was studied in two conditions: ON and OFF dopaminergic therapy. Results Results showed that the early PD patients performed more poorly in the ON condition than in the OFF one, for overall emotion recognition, as well as for the recognition of anger, disgust and fear. Additionally, for anger, the early PD ON patients performed more poorly than controls. For overall emotion recognition, both advanced PD patients and early PD ON patients performed more poorly than controls. Analysis of continuous ratings on target and nontarget visual analog scales confirmed these patterns of results, showing a systematic emotional bias in both the advanced PD and early PD ON (but not OFF) patients compared with controls. Conclusions These results i) confirm the involvement of the dopaminergic pathways and basal ganglia in emotional prosody recognition, and ii) suggest a possibly deleterious effect of dopamine therapy on affective abilities in the early stages of PD. PMID:24651759
Sellaro, Roberta; de Gelder, Beatrice; Finisguerra, Alessandra; Colzato, Lorenza S
2018-02-01
The polyvagal theory suggests that the vagus nerve is the key phylogenetic substrate enabling optimal social interactions, a crucial aspect of which is emotion recognition. A previous study showed that the vagus nerve plays a causal role in mediating people's ability to recognize emotions based on images of the eye region. The aim of this study is to verify whether the previously reported causal link between vagal activity and emotion recognition can be generalized to situations in which emotions must be inferred from images of whole faces and bodies. To this end, we employed transcutaneous vagus nerve stimulation (tVNS), a novel non-invasive brain stimulation technique that causes the vagus nerve to fire by the application of a mild electrical stimulation to the auricular branch of the vagus nerve, located in the anterior protuberance of the outer ear. In two separate sessions, participants received active or sham tVNS before and while performing two emotion recognition tasks, aimed at indexing their ability to recognize emotions from facial and bodily expressions. Active tVNS, compared to sham stimulation, enhanced emotion recognition for whole faces but not for bodies. Our results confirm and further extend recent observations supporting a causal relationship between vagus nerve activity and the ability to infer others' emotional state, but restrict this association to situations in which the emotional state is conveyed by the whole face and/or by salient facial cues, such as eyes. Copyright © 2017 Elsevier Ltd. All rights reserved.
Emotion recognition based on physiological changes in music listening.
Kim, Jonghwa; André, Elisabeth
2008-12-01
Little attention has been paid so far to physiological signals for emotion recognition compared to audiovisual emotion channels such as facial expression or speech. This paper investigates the potential of physiological signals as reliable channels for emotion recognition. All essential stages of an automatic recognition system are discussed, from the recording of a physiological dataset to a feature-based multiclass classification. In order to collect a physiological dataset from multiple subjects over many weeks, we used a musical induction method which spontaneously leads subjects to real emotional states, without any deliberate lab setting. Four-channel biosensors were used to measure electromyogram, electrocardiogram, skin conductivity and respiration changes. A wide range of physiological features from various analysis domains, including time/frequency, entropy, geometric analysis, subband spectra, multiscale entropy, etc., is proposed in order to find the best emotion-relevant features and to correlate them with emotional states. The best features extracted are specified in detail and their effectiveness is proven by classification results. Classification of four musical emotions (positive/high arousal, negative/high arousal, negative/low arousal, positive/low arousal) is performed by using an extended linear discriminant analysis (pLDA). Furthermore, by exploiting a dichotomic property of the 2D emotion model, we develop a novel scheme of emotion-specific multilevel dichotomous classification (EMDC) and compare its performance with direct multiclass classification using the pLDA. Improved recognition accuracy of 95% and 70% for subject-dependent and subject-independent classification, respectively, is achieved by using the EMDC scheme.
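The dichotomic structure EMDC exploits, splitting the four-class problem into a cascade of binary decisions along the arousal and valence axes of the 2D emotion model, can be sketched as follows. The classifier callables here are placeholders, not the paper's pLDA; function and variable names are assumptions for the sketch:

```python
QUADRANTS = {  # 2D emotion model: (arousal, valence) -> quadrant label
    (1, 1): "positive/high arousal",
    (1, 0): "negative/high arousal",
    (0, 0): "negative/low arousal",
    (0, 1): "positive/low arousal",
}

def emdc_classify(features, arousal_clf, valence_clfs):
    """Emotion-specific multilevel dichotomous classification (EMDC):
    stage 1 splits high vs. low arousal; stage 2 applies an
    arousal-specific valence classifier, so each binary sub-problem
    is easier than the direct four-class decision."""
    labels = []
    for x in features:
        a = int(arousal_clf(x))       # 1 = high arousal, 0 = low arousal
        v = int(valence_clfs[a](x))   # 1 = positive, 0 = negative valence
        labels.append(QUADRANTS[(a, v)])
    return labels
```

Because the second-stage valence classifier is trained separately for each arousal branch, it can use features tuned to that branch, which is the design choice the abstract credits for the improved accuracy over direct multiclass classification.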
Making sense of self-conscious emotion: linking theory of mind and emotion in children with autism.
Heerey, Erin A; Keltner, Dacher; Capps, Lisa M
2003-12-01
Self-conscious emotions such as embarrassment and shame are associated with 2 aspects of theory of mind (ToM): (a) the ability to understand that behavior has social consequences in the eyes of others and (b) an understanding of social norm violations. The present study aimed to link ToM with the recognition of self-conscious emotion. Children with and without autism identified facial expressions of self-conscious and non-self-conscious emotions from photographs. ToM was also measured. Children with autism performed more poorly than comparison children at identifying self-conscious emotions, though they did not differ in the recognition of non-self-conscious emotions. When ToM ability was statistically controlled, group differences in the recognition of self-conscious emotion disappeared. Discussion focused on the links between ToM and self-conscious emotion.
Rapid communication: Global-local processing affects recognition of distractor emotional faces.
Srinivasan, Narayanan; Gupta, Rashmi
2011-03-01
Recent studies have linked happy faces to global, distributed attention and sad faces to local, focused attention. Emotions have been shown to affect global-local processing. Given that studies on emotion-cognition interactions have not explored the effect of perceptual processing at different spatial scales on processing stimuli with emotional content, the present study investigated the link between perceptual focus and emotional processing. The study investigated the effects of global-local processing on the recognition of distractor faces with emotional expressions. Participants performed a digit discrimination task with digits at either the global level or the local level presented against a distractor face (happy or sad) as background. The results showed that global processing associated with a broad scope of attention facilitates recognition of happy faces, and local processing associated with a narrow scope of attention facilitates recognition of sad faces. The novel results of the study provide conclusive evidence for emotion-cognition interactions by demonstrating the effect of perceptual processing on emotional faces. The results, along with earlier complementary results on the effect of emotion on global-local processing, support a reciprocal relationship between emotional processing and global-local processing. Distractor processing with emotional information also has implications for theories of selective attention.
Daini, Roberta; Comparetti, Chiara M.; Ricciardelli, Paola
2014-01-01
Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown if a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand if individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus, process this facial dimension independently from features (which are impaired in CP), and basic emotional expressions. To this end, we carried out a behavioral study in which we compared the performance of 6 CP individuals to that of typically developing individuals, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was requested (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether or not the recognized face exactly matched the study face or showed any difference. The results confirmed the CPs' impairment in the configural processing of the invariant aspects of the face, but also showed a spared configural processing of non-emotional facial expression (task 1). Interestingly, and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change detection ability (task 2). These new results have theoretical implications for face perception models since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition. PMID:25520643