Family environment influences emotion recognition following paediatric traumatic brain injury.
Schmidt, Adam T; Orsten, Kimberley D; Hanten, Gerri R; Li, Xiaoqi; Levin, Harvey S
2010-01-01
This study investigated the relationship between family functioning and performance on two tasks of emotion recognition (emotional prosody and face emotion recognition) and a cognitive control procedure (the Flanker task) following paediatric traumatic brain injury (TBI) or orthopaedic injury (OI). A total of 142 children (75 TBI, 67 OI) were assessed on three occasions: baseline, 3 months and 1 year post-injury on the two emotion recognition tasks and the Flanker task. Caregivers also completed the Life Stressors and Resources Scale (LISRES) on each occasion. Growth curve analysis was used to analyse the data. Results indicated that family functioning influenced performance on the emotional prosody and Flanker tasks but not on the face emotion recognition task. Findings on both the emotional prosody and Flanker tasks were generally similar across groups. However, financial resources emerged as significantly related to emotional prosody performance in the TBI group only (p = 0.0123). Findings suggest family functioning variables--especially financial resources--can influence performance on an emotional processing task following TBI in children.
Family environment influences emotion recognition following paediatric traumatic brain injury
SCHMIDT, ADAM T.; ORSTEN, KIMBERLEY D.; HANTEN, GERRI R.; LI, XIAOQI; LEVIN, HARVEY S.
2011-01-01
Objective: This study investigated the relationship between family functioning and performance on two tasks of emotion recognition (emotional prosody and face emotion recognition) and a cognitive control procedure (the Flanker task) following paediatric traumatic brain injury (TBI) or orthopaedic injury (OI). Methods: A total of 142 children (75 TBI, 67 OI) were assessed on three occasions (baseline, 3 months and 1 year post-injury) on the two emotion recognition tasks and the Flanker task. Caregivers also completed the Life Stressors and Resources Scale (LISRES) on each occasion. Growth curve analysis was used to analyse the data. Results: Family functioning influenced performance on the emotional prosody and Flanker tasks but not on the face emotion recognition task. Findings on both the emotional prosody and Flanker tasks were generally similar across groups. However, financial resources emerged as significantly related to emotional prosody performance in the TBI group only (p = 0.0123). Conclusions: Findings suggest family functioning variables—especially financial resources—can influence performance on an emotional processing task following TBI in children. PMID:21058900
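The two records above analyse repeated assessments (baseline, 3 months, 1 year) with growth curve analysis. As an illustration only, the sketch below fits a simple linear growth model as a mixed-effects regression in Python with statsmodels; the file and column names (subject, months_post_injury, prosody_score, group, financial_resources) are hypothetical placeholders, not variables from the study.

```python
# Minimal growth-curve sketch: random intercepts and slopes over time,
# with group and a family-resource covariate as fixed effects.
# File and column names are illustrative placeholders, not the study's variables.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format file: one row per child per occasion (0, 3, 12 months post-injury).
df = pd.read_csv("emotion_prosody_long.csv")

model = smf.mixedlm(
    "prosody_score ~ months_post_injury * group + financial_resources",
    data=df,
    groups=df["subject"],                 # random effects grouped by child
    re_formula="~months_post_injury",     # random intercept and slope per child
)
result = model.fit()
print(result.summary())
```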
Buratto, Luciano G.; Pottage, Claire L.; Brown, Charity; Morrison, Catriona M.; Schaefer, Alexandre
2014-01-01
Memory performance is usually impaired when participants have to encode information while performing a concurrent task. Recent studies using recall tasks have found that emotional items are more resistant to such cognitive depletion effects than non-emotional items. However, when recognition tasks are used, the same effect is more elusive as recent recognition studies have obtained contradictory results. In two experiments, we provide evidence that negative emotional content can reliably reduce the effects of cognitive depletion on recognition memory only if stimuli with high levels of emotional intensity are used. In particular, we found that recognition performance for realistic pictures was impaired by a secondary 3-back working memory task during encoding if stimuli were emotionally neutral or had moderate levels of negative emotionality. In contrast, when negative pictures with high levels of emotional intensity were used, the detrimental effects of the secondary task were significantly attenuated. PMID:25330251
Emotion recognition in Parkinson's disease: Static and dynamic factors.
Wasser, Cory I; Evans, Felicity; Kempnich, Clare; Glikmann-Johnston, Yifat; Andrews, Sophie C; Thyagarajan, Dominic; Stout, Julie C
2018-02-01
The authors tested the hypothesis that Parkinson's disease (PD) participants would perform better in an emotion recognition task with dynamic (video) stimuli compared to a task using only static (photograph) stimuli and compared performances on both tasks to healthy control participants. In a within-subjects study, 21 PD participants and 20 age-matched healthy controls performed both static and dynamic emotion recognition tasks. The authors used a 2-way analysis of variance (controlling for individual participant variance) to determine the effect of group (PD, control) on emotion recognition performance in static and dynamic facial recognition tasks. Groups did not significantly differ in their performances on the static and dynamic tasks; however, the trend was suggestive that PD participants performed worse than controls. PD participants may have subtle emotion recognition deficits that are not ameliorated by the addition of contextual cues, similar to those found in everyday scenarios. Consistent with previous literature, the results suggest that PD participants may have underlying emotion recognition deficits, which may impact their social functioning. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
ERIC Educational Resources Information Center
Treese, Anne-Cecile; Johansson, Mikael; Lindgren, Magnus
2010-01-01
The emotional salience of faces has previously been shown to induce memory distortions in recognition memory tasks. This event-related potential (ERP) study used repeated runs of a continuous recognition task with emotional and neutral faces to investigate emotion-induced memory distortions. In the second and third runs, participants made more…
Does cortisol modulate emotion recognition and empathy?
Duesenberg, Moritz; Weber, Juliane; Schulze, Lars; Schaeuffele, Carmen; Roepke, Stefan; Hellmann-Regen, Julian; Otte, Christian; Wingenfeld, Katja
2016-04-01
Emotion recognition and empathy are important aspects of interacting with and understanding other people's behaviors and feelings. The human environment comprises stressful situations that impact social interactions on a daily basis. The aim of the study was to examine the effects of the stress hormone cortisol on emotion recognition and empathy. In this placebo-controlled study, 40 healthy men and 40 healthy women (mean age 24.5 years) received either 10 mg of hydrocortisone or placebo. We used the Multifaceted Empathy Test to measure emotional and cognitive empathy. Furthermore, we examined emotion recognition from facial expressions, which contained two emotions (anger and sadness) and two emotion intensities (40% and 80%). We did not find a main effect of treatment or sex on either empathy or emotion recognition, but we did find a sex × emotion interaction on emotion recognition. The main result was a four-way interaction on emotion recognition including treatment, sex, emotion and task difficulty. At 40% task difficulty, women recognized angry faces better than men in the placebo condition. Furthermore, in the placebo condition, men recognized sadness better than anger. At 80% task difficulty, men and women performed equally well in recognizing sad faces, but men performed worse than women with regard to angry faces. Thus, our results did not support the hypothesis that increases in cortisol concentration alone influence empathy and emotion recognition in healthy young individuals. However, sex and task difficulty appear to be important variables in emotion recognition from facial expressions. Copyright © 2016 Elsevier Ltd. All rights reserved.
Golan, Ofer; Baron-Cohen, Simon; Golan, Yael
2008-09-01
Children with autism spectrum conditions (ASC) have difficulties recognizing others' emotions. Research has mostly focused on basic emotion recognition, devoid of context. This study reports the results of a new task, assessing recognition of complex emotions and mental states in social contexts. An ASC group (n = 23) was compared to a general population control group (n = 24). Children with ASC performed lower than controls on the task. Using task scores, more than 87% of the participants were allocated to their group. This new test quantifies complex emotion and mental state recognition in life-like situations. Our findings reveal that children with ASC have residual difficulties in this aspect of empathy. The use of language-based compensatory strategies for emotion recognition is discussed.
Goghari, Vina M; Macdonald, Angus W; Sponheim, Scott R
2011-11-01
Temporal lobe abnormalities and emotion recognition deficits are prominent features of schizophrenia and appear related to the diathesis of the disorder. This study investigated whether temporal lobe structural abnormalities were associated with facial emotion recognition deficits in schizophrenia and related to genetic liability for the disorder. Twenty-seven schizophrenia patients, 23 biological family members, and 36 controls participated. Several temporal lobe regions (fusiform, superior temporal, middle temporal, amygdala, and hippocampus) previously associated with face recognition in normative samples and found to be abnormal in schizophrenia were evaluated using volumetric analyses. Participants completed a facial emotion recognition task and an age recognition control task under time-limited and self-paced conditions. Temporal lobe volumes were tested for associations with task performance. Group status explained 23% of the variance in temporal lobe volume. Left fusiform gray matter volume was decreased by 11% in patients and 7% in relatives compared with controls. Schizophrenia patients additionally exhibited smaller hippocampal and middle temporal volumes. Patients were unable to improve facial emotion recognition performance with unlimited time to make a judgment but were able to improve age recognition performance. Patients additionally showed a relationship between reduced temporal lobe gray matter and poor facial emotion recognition. For the middle temporal lobe region, the relationship between greater volume and better task performance was specific to facial emotion recognition and not age recognition. Because schizophrenia patients exhibited a specific deficit in emotion recognition not attributable to a generalized impairment in face perception, impaired emotion recognition may serve as a target for interventions.
Baez, Sandra; Marengo, Juan; Perez, Ana; Huepe, David; Font, Fernanda Giralt; Rial, Veronica; Gonzalez-Gadea, María Luz; Manes, Facundo; Ibanez, Agustin
2015-09-01
Impaired social cognition has been claimed to be a mechanism underlying the development and maintenance of borderline personality disorder (BPD). One important aspect of social cognition is the theory of mind (ToM), a complex skill that seems to be influenced by more basic processes, such as executive functions (EF) and emotion recognition. Previous ToM studies in BPD have yielded inconsistent results. This study assessed the performance of BPD adults on ToM, emotion recognition, and EF tasks. We also examined whether EF and emotion recognition could predict the performance on ToM tasks. We evaluated 15 adults with BPD and 15 matched healthy controls using different tasks of EF, emotion recognition, and ToM. The results showed that BPD adults exhibited deficits in the three domains, which seem to be task-dependent. Furthermore, we found that EF and emotion recognition predicted the performance on ToM. Our results suggest that tasks that involve real-life social scenarios and contextual cues are more sensitive to detect ToM and emotion recognition deficits in BPD individuals. Our findings also indicate that (a) ToM variability in BPD is partially explained by individual differences on EF and emotion recognition; and (b) ToM deficits of BPD patients are partially explained by the capacity to integrate cues from face, prosody, gesture, and social context to identify the emotions and others' beliefs. © 2014 The British Psychological Society.
Eye-Gaze Analysis of Facial Emotion Recognition and Expression in Adolescents with ASD.
Wieckowski, Andrea Trubanova; White, Susan W
2017-01-01
Impaired emotion recognition and expression in individuals with autism spectrum disorder (ASD) may contribute to observed social impairment. The aim of this study was to examine the role of visual attention directed toward nonsocial aspects of a scene as a possible mechanism underlying recognition and expression deficits in ASD. One recognition and two expression tasks were administered. Recognition was assessed in a forced-choice paradigm, and expression was assessed during scripted and free-choice response (in response to emotional stimuli) tasks in youth with ASD (n = 20) and an age-matched sample of typically developing youth (n = 20). During stimulus presentation prior to response in each task, participants' eye gaze was tracked. Youth with ASD were less accurate at identifying disgust and sadness in the recognition task. They fixated less on the eye region of stimuli showing surprise. A group difference was found during the free-choice response task, such that those with ASD expressed emotion less clearly, but not during the scripted task. Results suggest altered eye gaze to the mouth region, but not the eye region, as a candidate mechanism for decreased ability to recognize or express emotion. Findings inform our understanding of the association between social attention and emotion recognition and expression deficits.
von Piekartz, H; Wallwork, S B; Mohr, G; Butler, D S; Moseley, G L
2015-04-01
Alexithymia, or a lack of emotional awareness, is prevalent in some chronic pain conditions and has been linked to poor recognition of others' emotions. Recognising others' emotions from their facial expression involves both emotional and motor processing, but the possible contribution of motor disruption has not been considered. It is possible that poor performance on emotional recognition tasks could reflect problems with emotional processing, motor processing or both. We hypothesised that people with chronic facial pain would be less accurate in recognising others' emotions from facial expressions, would be less accurate in a motor imagery task involving the face, and that performance on both tasks would be positively related. A convenience sample of 19 people (15 females) with chronic facial pain and 19 gender-matched controls participated. They undertook two tasks; in the first task, they identified the facial emotion presented in a photograph. In the second, they identified whether the person in the image had a facial feature pointed towards their left or right side, a well-recognised paradigm to induce implicit motor imagery. People with chronic facial pain performed worse than controls at both tasks (Facially Expressed Emotion Labelling (FEEL) task P < 0·001; left/right judgment task P < 0·001). Participants who were more accurate at one task were also more accurate at the other, regardless of group (P < 0·001, r² = 0·523). Participants with chronic facial pain were worse than controls at both the FEEL emotion recognition task and the left/right facial expression task, and performance covaried within participants. We propose that disrupted motor processing may underpin or at least contribute to the difficulty that facial pain patients have in emotion recognition and that further research that tests this proposal is warranted. © 2014 John Wiley & Sons Ltd.
Familiarity and face emotion recognition in patients with schizophrenia.
Lahera, Guillermo; Herrera, Sara; Fernández, Cristina; Bardón, Marta; de los Ángeles, Victoria; Fernández-Liria, Alberto
2014-01-01
This study aimed to assess emotion recognition of familiar and unknown faces in a sample of patients with schizophrenia and healthy controls. Face emotion recognition of 18 outpatients diagnosed with schizophrenia (DSM-IV-TR) and 18 healthy volunteers was assessed with two Emotion Recognition Tasks using familiar faces and unknown faces. Each subject was accompanied by 4 familiar people (parents, siblings or friends), who were photographed expressing the six basic Ekman emotions. Face emotion recognition in familiar faces was assessed with this ad hoc instrument. In each case, the patient scored (from 1 to 10) the subjective familiarity and affective valence corresponding to each person. Patients with schizophrenia not only showed a deficit in the recognition of emotions on unknown faces (p=.01), but they also showed an even more pronounced deficit on familiar faces (p=.001). Controls had a similar success rate in the unknown faces task (mean: 18 ± 2.2) and the familiar faces task (mean: 17.4 ± 3). However, patients had a significantly lower score in the familiar faces task (mean: 13.2 ± 3.8) than in the unknown faces task (mean: 16 ± 2.4; p<.05). In both tests, the highest number of errors was with emotions of anger and fear. Subjectively, the patient group showed a lower level of familiarity and emotional valence towards their respective relatives (p<.01). The sense of familiarity may be a factor involved in face emotion recognition, and it may be disturbed in schizophrenia. © 2013.
Recognizing biological motion and emotions from point-light displays in autism spectrum disorders.
Nackaerts, Evelien; Wagemans, Johan; Helsen, Werner; Swinnen, Stephan P; Wenderoth, Nicole; Alaerts, Kaat
2012-01-01
One of the main characteristics of Autism Spectrum Disorder (ASD) is problems with social interaction and communication. Here, we explored ASD-related alterations in 'reading' the body language of other humans. Accuracy and reaction times were assessed from two observational tasks involving the recognition of 'biological motion' and 'emotions' from point-light displays (PLDs). Eye movements were recorded during the completion of the tests. Results indicated that typically developing participants were more accurate than participants with ASD in recognizing biological motion or emotions from PLDs. No accuracy differences were revealed on two control tasks (involving the indication of color changes in the moving point-lights). Group differences in reaction times existed on all tasks, but effect sizes were higher for the biological motion and emotion recognition tasks. Biological motion recognition abilities were related to a person's ability to recognize emotions from PLDs. However, ASD-related atypicalities in emotion recognition could not be attributed entirely to more basic deficits in biological motion recognition, suggesting an additional ASD-specific deficit in recognizing the emotional dimension of the point-light displays. Eye-movement results indicated that participants with ASD generally produced more saccades and shorter fixation durations compared to the control group. However, especially for emotion recognition, these altered eye movements were associated with reductions in task performance.
Emotion Recognition in Frontotemporal Dementia and Alzheimer's Disease: A New Film-Based Assessment
Goodkind, Madeleine S.; Sturm, Virginia E.; Ascher, Elizabeth A.; Shdo, Suzanne M.; Miller, Bruce L.; Rankin, Katherine P.; Levenson, Robert W.
2015-01-01
Deficits in recognizing others' emotions are reported in many psychiatric and neurological disorders, including autism, schizophrenia, behavioral variant frontotemporal dementia (bvFTD) and Alzheimer's disease (AD). Most previous emotion recognition studies have required participants to identify emotional expressions in photographs. This type of assessment differs from real-world emotion recognition in important ways: Images are static rather than dynamic, include only 1 modality of emotional information (i.e., visual information), and are presented absent a social context. Additionally, existing emotion recognition batteries typically include multiple negative emotions, but only 1 positive emotion (i.e., happiness) and no self-conscious emotions (e.g., embarrassment). We present initial results using a new task for assessing emotion recognition that was developed to address these limitations. In this task, respondents view a series of short film clips and are asked to identify the main characters' emotions. The task assesses multiple negative, positive, and self-conscious emotions based on information that is multimodal, dynamic, and socially embedded. We evaluate this approach in a sample of patients with bvFTD, AD, and normal controls. Results indicate that patients with bvFTD have emotion recognition deficits in all 3 categories of emotion compared to the other groups. These deficits were especially pronounced for negative and self-conscious emotions. Emotion recognition in this sample of patients with AD was indistinguishable from controls. These findings underscore the utility of this approach to assessing emotion recognition and suggest that previous findings that recognition of positive emotion was preserved in dementia patients may have resulted from the limited sampling of positive emotion in traditional tests. PMID:26010574
Lodder, Gerine M A; Scholte, Ron H J; Goossens, Luc; Engels, Rutger C M E; Verhagen, Maaike
2016-02-01
Based on the belongingness regulation theory (Gardner et al., 2005, Pers. Soc. Psychol. Bull., 31, 1549), this study focuses on the relationship between loneliness and social monitoring. Specifically, we examined whether loneliness relates to performance on three emotion recognition tasks and whether lonely individuals show increased gazing towards their conversation partner's faces in a real-life conversation. Study 1 examined 170 college students (Mage = 19.26; SD = 1.21) who completed an emotion recognition task with dynamic stimuli (morph task) and a micro(-emotion) expression recognition task. Study 2 examined 130 college students (Mage = 19.33; SD = 2.00) who completed the Reading the Mind in the Eyes Test and who had a conversation with an unfamiliar peer while their gaze direction was videotaped. In both studies, loneliness was measured using the UCLA Loneliness Scale version 3 (Russell, 1996, J. Pers. Assess., 66, 20). The results showed that loneliness was unrelated to emotion recognition on all emotion recognition tasks, but that it was related to increased gaze towards their conversation partner's faces. Implications for the belongingness regulation system of lonely individuals are discussed. © 2015 The British Psychological Society.
Sassenrath, Claudia; Sassenberg, Kai; Ray, Devin G; Scheiter, Katharina; Jarodzka, Halszka
2014-01-01
Two studies examined an unexplored motivational determinant of facial emotion recognition: observer regulatory focus. It was predicted that a promotion focus would enhance facial emotion recognition relative to a prevention focus because the attentional strategies associated with a promotion focus enhance performance on well-learned or innate tasks, such as facial emotion recognition. In Study 1, a promotion or a prevention focus was experimentally induced, and better facial emotion recognition was observed in a promotion focus compared to a prevention focus. In Study 2, individual differences in chronic regulatory focus were assessed and attention allocation was measured using eye tracking during the facial emotion recognition task. Results indicated that the positive relation between a promotion focus and facial emotion recognition is mediated by shorter fixation duration on the face, which reflects a pattern of attention allocation matched to the eager strategy in a promotion focus (i.e., striving to make hits). A prevention focus had no impact on either perceptual processing or facial emotion recognition. Taken together, these findings demonstrate important mechanisms and consequences of observer motivational orientation for facial emotion recognition.
Recognition of emotion with temporal lobe epilepsy and asymmetrical amygdala damage.
Fowler, Helen L; Baker, Gus A; Tipples, Jason; Hare, Dougal J; Keller, Simon; Chadwick, David W; Young, Andrew W
2006-08-01
Impairments in emotion recognition occur when there is bilateral damage to the amygdala. In this study, ability to recognize auditory and visual expressions of emotion was investigated in people with asymmetrical amygdala damage (AAD) and temporal lobe epilepsy (TLE). Recognition of five emotions was tested across three participant groups: those with right AAD and TLE, those with left AAD and TLE, and a comparison group. Four tasks were administered: recognition of emotion from facial expressions, sentences describing emotion-laden situations, nonverbal sounds, and prosody. Accuracy scores for each task and emotion were analysed, and no consistent overall effect of AAD on emotion recognition was found. However, some individual participants with AAD were significantly impaired at recognizing emotions, in both auditory and visual domains. The findings indicate that a minority of individuals with AAD have impairments in emotion recognition, but no evidence of specific impairments (e.g., visual or auditory) was found.
Test battery for measuring the perception and recognition of facial expressions of emotion
Wilhelm, Oliver; Hildebrandt, Andrea; Manske, Karsten; Schacht, Annekathrin; Sommer, Werner
2014-01-01
Despite the importance of perceiving and recognizing facial expressions in everyday life, there is no comprehensive test battery for the multivariate assessment of these abilities. As a first step toward such a compilation, we present 16 tasks that measure the perception and recognition of facial emotion expressions, and data illustrating each task's difficulty and reliability. The scoring of these tasks focuses on either the speed or accuracy of performance. A sample of 269 healthy young adults completed all tasks. In general, accuracy and reaction time measures for emotion-general scores showed acceptable and high estimates of internal consistency and factor reliability. Emotion-specific scores yielded lower reliabilities, yet high enough to encourage further studies with such measures. Analyses of task difficulty revealed that all tasks are suitable for measuring emotion perception and emotion recognition related abilities in normal populations. PMID:24860528
Test-retest reliability and task order effects of emotional cognitive tests in healthy subjects.
Adams, Thomas; Pounder, Zoe; Preston, Sally; Hanson, Andy; Gallagher, Peter; Harmer, Catherine J; McAllister-Williams, R Hamish
2016-11-01
Little is known of the retest reliability of emotional cognitive tasks or the impact of using different tasks employing similar emotional stimuli within a battery. We investigated this in healthy subjects. We found improved overall performance in an emotional attentional blink task (EABT) with repeat testing at one hour and one week compared to baseline, but the impact of an emotional stimulus on performance was unchanged. Similarly, performance on a facial expression recognition task (FERT) was better one week after a baseline test, though the relative effect of specific emotions was unaltered. There was no effect of repeat testing on an emotional word categorising, recall and recognition task. We found no difference in performance in the FERT and EABT irrespective of task order. We concluded that it is possible to use emotional cognitive tasks in longitudinal studies and combine tasks using emotional facial stimuli in a single battery.
A multimodal approach to emotion recognition ability in autism spectrum disorders.
Jones, Catherine R G; Pickles, Andrew; Falcaro, Milena; Marsden, Anita J S; Happé, Francesca; Scott, Sophie K; Sauter, Disa; Tregay, Jenifer; Phillips, Rebecca J; Baird, Gillian; Simonoff, Emily; Charman, Tony
2011-03-01
Autism spectrum disorders (ASD) are characterised by social and communication difficulties in day-to-day life, including problems in recognising emotions. However, experimental investigations of emotion recognition ability in ASD have been equivocal, hampered by small sample sizes, narrow IQ range and over-focus on the visual modality. We tested 99 adolescents (mean age 15;6 years, mean IQ 85) with an ASD and 57 adolescents without an ASD (mean age 15;6 years, mean IQ 88) on a facial emotion recognition task and two vocal emotion recognition tasks (one verbal; one non-verbal). Recognition of happiness, sadness, fear, anger, surprise and disgust were tested. Using structural equation modelling, we conceptualised emotion recognition ability as a multimodal construct, measured by the three tasks. We examined how the mean levels of recognition of the six emotions differed by group (ASD vs. non-ASD) and IQ (≥ 80 vs. < 80). We found no evidence of a fundamental emotion recognition deficit in the ASD group and analysis of error patterns suggested that the ASD group were vulnerable to the same pattern of confusions between emotions as the non-ASD group. However, recognition ability was significantly impaired in the ASD group for surprise. IQ had a strong and significant effect on performance for the recognition of all six emotions, with higher IQ adolescents outperforming lower IQ adolescents. The findings do not suggest a fundamental difficulty with the recognition of basic emotions in adolescents with ASD. © 2010 The Authors. Journal of Child Psychology and Psychiatry © 2010 Association for Child and Adolescent Mental Health.
Koelkebeck, Katja; Kohl, Waldemar; Luettgenau, Julia; Triantafillou, Susanna; Ohrmann, Patricia; Satoh, Shinji; Minoshita, Seiko
2015-07-30
A novel emotion recognition task that employs photos of a Japanese mask representing a highly ambiguous stimulus was evaluated. As non-Asians perceive and/or label emotions differently from Asians, we aimed to identify patterns of task-performance in non-Asian healthy volunteers with a view to future patient studies. The Noh mask test was presented to 42 adult German participants. Reaction times and emotion attribution patterns were recorded. To control for emotion identification abilities, a standard emotion recognition task was used among others. Questionnaires assessed personality traits. Finally, results were compared to age- and gender-matched Japanese volunteers. Compared to other tasks, German participants displayed slowest reaction times on the Noh mask test, indicating higher demands of ambiguous emotion recognition. They assigned more positive emotions to the mask than Japanese volunteers, demonstrating culture-dependent emotion identification patterns. As alexithymic and anxious traits were associated with slower reaction times, personality dimensions impacted on performance, as well. We showed an advantage of ambiguous over conventional emotion recognition tasks. Moreover, we determined emotion identification patterns in Western individuals impacted by personality dimensions, suggesting performance differences in clinical samples. Due to its properties, the Noh mask test represents a promising tool in the differential diagnosis of psychiatric disorders, e.g. schizophrenia. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Specific Impairments in the Recognition of Emotional Facial Expressions in Parkinson’s Disease
Clark, Uraina S.; Neargarder, Sandy; Cronin-Golomb, Alice
2008-01-01
Studies investigating the ability to recognize emotional facial expressions in non-demented individuals with Parkinson’s disease (PD) have yielded equivocal findings. A possible reason for this variability may lie in the confounding of emotion recognition with cognitive task requirements, a confound arising from the lack of a control condition using non-emotional stimuli. The present study examined emotional facial expression recognition abilities in 20 non-demented patients with PD and 23 control participants relative to their performances on a non-emotional landscape categorization test with comparable task requirements. We found that PD participants were normal on the control task but exhibited selective impairments in the recognition of facial emotion, specifically for anger (driven by those with right hemisphere pathology) and surprise (driven by those with left hemisphere pathology), even when controlling for depression level. Male but not female PD participants further displayed specific deficits in the recognition of fearful expressions. We suggest that the neural substrates that may subserve these impairments include the ventral striatum, amygdala, and prefrontal cortices. Finally, we observed that in PD participants, deficiencies in facial emotion recognition correlated with higher levels of interpersonal distress, which calls attention to the significant psychosocial impact that facial emotion recognition impairments may have on individuals with PD. PMID:18485422
Facial emotion recognition in patients with focal and diffuse axonal injury.
Yassin, Walid; Callahan, Brandy L; Ubukata, Shiho; Sugihara, Genichi; Murai, Toshiya; Ueda, Keita
2017-01-01
Facial emotion recognition impairment has been well documented in patients with traumatic brain injury. Studies exploring the neural substrates involved in such deficits have implicated specific grey matter structures (e.g. orbitofrontal regions), as well as diffuse white matter damage. Our study aims to clarify whether different types of injuries (i.e. focal vs. diffuse) will lead to different types of impairments on facial emotion recognition tasks, as no study has directly compared these patients. The present study examined performance and response patterns on a facial emotion recognition task in 14 participants with diffuse axonal injury (DAI), 14 with focal injury (FI) and 22 healthy controls. We found that, overall, participants with FI and DAI performed more poorly than controls on the facial emotion recognition task. Further, we observed comparable emotion recognition performance in participants with FI and DAI, despite differences in the nature and distribution of their lesions. However, the rating response pattern between the patient groups was different. This is the first study to show that pure DAI, without gross focal lesions, can independently lead to facial emotion recognition deficits and that rating patterns differ depending on the type and location of trauma.
Preti, Emanuele; Richetin, Juliette; Suttora, Chiara; Pisani, Alberto
2016-04-30
Dysfunctions in social cognition characterize personality disorders. However, mixed results have emerged from the literature on emotion processing: Borderline Personality Disorder (BPD) traits have been associated with enhanced emotion recognition, with impairments, or with functioning comparable to that of controls. These apparent contradictions might result from the complexity of the emotion recognition tasks used and from individual differences in impulsivity and effortful control. We conducted a study in a sample of undergraduate students (n=80) in which we assessed BPD traits and used an emotion recognition task that requires the processing of visual information only or of both visual and acoustic information. We also measured individual differences in impulsivity and effortful control. Results demonstrated that some components of impulsivity and effortful control moderated the extent to which BPD traits predicted anger and happiness recognition. We organized the discussion around the interaction between different components of regulatory functioning and task complexity for a better understanding of emotion recognition in BPD samples. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Kumfor, Fiona; Irish, Muireann; Hodges, John R.; Piguet, Olivier
2013-01-01
Patients with frontotemporal dementia have pervasive changes in emotion recognition and social cognition, yet the neural changes underlying these emotion processing deficits remain unclear. The multimodal system model of emotion proposes that basic emotions are dependent on distinct brain regions, which undergo significant pathological changes in frontotemporal dementia. As such, this syndrome may provide important insight into the impact of neural network degeneration upon the innate ability to recognise emotions. This study used voxel-based morphometry to identify discrete neural correlates involved in the recognition of basic emotions (anger, disgust, fear, sadness, surprise and happiness) in frontotemporal dementia. Forty frontotemporal dementia patients (18 behavioural-variant, 11 semantic dementia, 11 progressive nonfluent aphasia) and 27 healthy controls were tested on two facial emotion recognition tasks: The Ekman 60 and Ekman Caricatures. Although each frontotemporal dementia group showed impaired recognition of negative emotions, distinct associations between emotion-specific task performance and changes in grey matter intensity emerged. Fear recognition was associated with the right amygdala; disgust recognition with the left insula; anger recognition with the left middle and superior temporal gyrus; and sadness recognition with the left subcallosal cingulate, indicating that discrete neural substrates are necessary for emotion recognition in frontotemporal dementia. The erosion of emotion-specific neural networks in neurodegenerative disorders may produce distinct profiles of performance that are relevant to understanding the neurobiological basis of emotion processing. PMID:23805313
Daini, Roberta; Comparetti, Chiara M.; Ricciardelli, Paola
2014-01-01
Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown if a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand if individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one and thus process this facial dimension independently from features (which are impaired in CP) and basic emotional expressions. To this end, we carried out a behavioral study in which we compared the performance of 6 CP individuals to that of individuals with typical development, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was requested (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether the recognized face exactly matched the study face or showed any difference. The results confirmed the CPs' impairment in the configural processing of the invariant aspects of the face, but also showed spared configural processing of non-emotional facial expressions (task 1). Interestingly, and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change-detection ability (task 2). These new results have theoretical implications for face perception models since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition. PMID:25520643
Facial emotion recognition is inversely correlated with tremor severity in essential tremor.
Auzou, Nicolas; Foubert-Samier, Alexandra; Dupouy, Sandrine; Meissner, Wassilios G
2014-04-01
Here we assessed limbic and orbitofrontal control in 20 patients with essential tremor (ET) and 18 age-matched healthy controls using the Ekman Facial Emotion Recognition Task and the Iowa Gambling Task. Our results show an inverse relation between facial emotion recognition and tremor severity. ET patients also showed worse performance in joy and fear recognition, as well as subtle abnormalities in risk detection, but these differences did not reach significance after correction for multiple testing.
Facial Emotion Recognition in Bipolar Disorder and Healthy Aging.
Altamura, Mario; Padalino, Flavia A; Stella, Eleonora; Balzotti, Angela; Bellomo, Antonello; Palumbo, Rocco; Di Domenico, Alberto; Mammarella, Nicola; Fairfield, Beth
2016-03-01
Emotional face recognition is impaired in bipolar disorder, but it is not clear whether this is specific for the illness. Here, we investigated how aging and bipolar disorder influence dynamic emotional face recognition. Twenty older adults, 16 bipolar patients, and 20 control subjects performed a dynamic affective facial recognition task and a subsequent rating task. Participants pressed a key as soon as they were able to discriminate whether the neutral face was assuming a happy or angry facial expression and then rated the intensity of each facial expression. Results showed that older adults recognized happy expressions faster, whereas bipolar patients recognized angry expressions faster. Furthermore, both groups rated emotional faces more intensely than did the control subjects. This study is one of the first to compare how aging and clinical conditions influence emotional facial recognition and underlines the need to consider the role of specific and common factors in emotional face recognition.
The involvement of emotion recognition in affective theory of mind.
Mier, Daniela; Lis, Stefanie; Neuthe, Kerstin; Sauer, Carina; Esslinger, Christine; Gallhofer, Bernd; Kirsch, Peter
2010-11-01
This study was conducted to explore the relationship between emotion recognition and affective Theory of Mind (ToM). Forty subjects performed a facial emotion recognition and an emotional intention recognition task (affective ToM) in an event-related fMRI study. Conjunction analysis revealed overlapping activation during both tasks. Activation in some of these conjunctly activated regions was even stronger during affective ToM than during emotion recognition, namely in the inferior frontal gyrus, the superior temporal sulcus, the temporal pole, and the amygdala. In contrast to previous studies investigating ToM, we found no activation in the anterior cingulate, commonly assumed as the key region for ToM. The results point to a close relationship of emotion recognition and affective ToM and can be interpreted as evidence for the assumption that at least basal forms of ToM occur by an embodied, non-cognitive process. Copyright © 2010 Society for Psychophysiological Research.
Scotland, Jennifer L; McKenzie, Karen; Cossar, Jill; Murray, Aja; Michie, Amanda
2016-01-01
This study aimed to evaluate the emotion recognition abilities of adults (n=23) with an intellectual disability (ID) compared with a control group of children (n=23) without ID matched for estimated cognitive ability. The study examined the impact of: task paradigm, stimulus type and preferred processing style (global/local) on accuracy. We found that, after controlling for estimated cognitive ability, the control group performed significantly better than the individuals with ID. This provides some support for the emotion specificity hypothesis. Having a more local processing style did not significantly mediate the relation between having ID and emotion recognition, but did significantly predict emotion recognition ability after controlling for group. This suggests that processing style is related to emotion recognition independently of having ID. The availability of contextual information improved emotion recognition for people with ID when compared with line drawing stimuli, and identifying a target emotion from a choice of two was relatively easier for individuals with ID, compared with the other task paradigms. The results of the study are considered in the context of current theories of emotion recognition deficits in individuals with ID. Copyright © 2015 Elsevier Ltd. All rights reserved.
Schultebraucks, Katharina; Deuter, Christian E; Duesenberg, Moritz; Schulze, Lars; Hellmann-Regen, Julian; Domke, Antonia; Lockenvitz, Lisa; Kuehl, Linn K; Otte, Christian; Wingenfeld, Katja
2016-09-01
Selective attention toward emotional cues and emotion recognition of facial expressions are important aspects of social cognition. Stress modulates social cognition through cortisol, which acts on glucocorticoid (GR) and mineralocorticoid receptors (MR) in the brain. We examined the role of MR activation on attentional bias toward emotional cues and on emotion recognition. We included 40 healthy young women and 40 healthy young men (mean age 23.9 ± 3.3 years), who received either 0.4 mg of the MR agonist fludrocortisone or placebo. A dot-probe paradigm was used to test for attentional biases toward emotional cues (happy and sad faces). Moreover, we used a facial emotion recognition task to investigate the ability to recognize emotional valence (anger and sadness) from facial expression in four graded categories of emotional intensity (20, 30, 40, and 80%). In the emotional dot-probe task, we found a main effect of treatment and a treatment × valence interaction. Post hoc analyses revealed an attentional bias away from sad faces after placebo intake and, after fludrocortisone intake, a shift in selective attention toward sad faces relative to placebo. We found no attentional bias toward happy faces after fludrocortisone or placebo intake. In the facial emotion recognition task, there was no main effect of treatment. MR stimulation seems to be important in modulating quick, automatic emotional processing, i.e., a shift in selective attention toward negative emotional cues. Our results confirm and extend previous findings of MR function. However, we did not find an effect of MR stimulation on emotion recognition.
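As background for the dot-probe results above, attentional bias is commonly quantified as the reaction-time difference between trials where the probe replaces the neutral face and trials where it replaces the emotional face, with positive values indicating vigilance toward the emotional cue. The sketch below illustrates that standard computation only; the file and column names are hypothetical placeholders, not data from the study.

```python
# Illustrative dot-probe attentional-bias computation.
# bias = mean RT(probe at neutral location) - mean RT(probe at emotional location);
# positive values indicate attention drawn toward the emotional face.
# File and column names are hypothetical placeholders.
import pandas as pd

trials = pd.read_csv("dot_probe_trials.csv")   # one row per trial
correct = trials[trials["accuracy"] == 1]       # keep correct responses only

mean_rt = (
    correct
    .groupby(["subject", "valence", "probe_location"])["rt_ms"]
    .mean()
    .unstack("probe_location")                  # columns: 'emotional', 'neutral'
)
bias_scores = (mean_rt["neutral"] - mean_rt["emotional"]).rename("bias_ms")
print(bias_scores.head())
```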
ERIC Educational Resources Information Center
Rojahn, Johannes; And Others
1995-01-01
This literature review discusses 21 studies on facial emotion recognition by persons with mental retardation in terms of methodological characteristics, stimulus material, salient variables and their relation to recognition tasks, and emotion recognition deficits in mental retardation. A table provides comparative data on all 21 studies. (DB)
Sully, K; Sonuga-Barke, E J S; Fairchild, G
2015-07-01
There is accumulating evidence of impairments in facial emotion recognition in adolescents with conduct disorder (CD). However, the majority of studies in this area have only been able to demonstrate an association, rather than a causal link, between emotion recognition deficits and CD. To move closer towards understanding the causal pathways linking emotion recognition problems with CD, we studied emotion recognition in the unaffected first-degree relatives of CD probands, as well as those with a diagnosis of CD. Using a family-based design, we investigated facial emotion recognition in probands with CD (n = 43), their unaffected relatives (n = 21), and healthy controls (n = 38). We used the Emotion Hexagon task, an alternative forced-choice task using morphed facial expressions depicting the six primary emotions, to assess facial emotion recognition accuracy. Relative to controls, the CD group showed impaired recognition of anger, fear, happiness, sadness and surprise (all p < 0.005). Similar to probands with CD, unaffected relatives showed deficits in anger and happiness recognition relative to controls (all p < 0.008), with a trend toward a deficit in fear recognition. There were no significant differences in performance between the CD probands and the unaffected relatives following correction for multiple comparisons. These results suggest that facial emotion recognition deficits are present in adolescents who are at increased familial risk for developing antisocial behaviour, as well as those who have already developed CD. Consequently, impaired emotion recognition appears to be a viable familial risk marker or candidate endophenotype for CD.
Drapeau, Joanie; Gosselin, Nathalie; Peretz, Isabelle; McKerral, Michelle
2017-01-01
This study aimed to assess emotion recognition from dynamic facial, vocal and musical expressions in sub-groups of adults with traumatic brain injury (TBI) of different severities and to identify possible common underlying mechanisms across domains. Forty-one adults participated in this study: 10 with moderate-severe TBI, 9 with complicated mild TBI, 11 with uncomplicated mild TBI and 11 healthy controls, who were administered experimental (emotion recognition, valence-arousal) and control tasks (emotional and structural discrimination) for each domain. Recognition of fearful faces was significantly impaired in the moderate-severe and complicated mild TBI sub-groups, as compared to those with uncomplicated mild TBI and controls. Effect sizes were medium-large. Participants with lower Glasgow Coma Scale (GCS) scores performed more poorly when recognizing fearful dynamic facial expressions. Emotion recognition from the auditory domains was preserved following TBI, irrespective of severity. All groups performed equally on the control tasks, indicating no perceptual disorders. Although emotion recognition from vocal and musical expressions was preserved, no correlation was found across auditory domains. This preliminary study may contribute to a better understanding of emotion recognition following TBI. Future studies with larger samples could usefully include measures of the functional impact of recognition deficits for fearful facial expressions; these could help refine interventions for emotion recognition following a brain injury.
Dissociation between facial and bodily expressions in emotion recognition: A case study.
Leiva, Samanta; Margulis, Laura; Micciulli, Andrea; Ferreres, Aldo
2017-12-21
Existing single-case studies have reported deficits in recognizing basic emotions through facial expression and unaffected performance with body expressions, but not the opposite pattern. The aim of this paper is to present a case study with impaired emotion recognition through body expressions and intact performance with facial expressions. In this single-case study we assessed a 30-year-old patient with autism spectrum disorder, without intellectual disability, and a healthy control group (n = 30) with four tasks of basic and complex emotion recognition through face and body movements, and two non-emotional control tasks. To analyze the dissociation between facial and body expressions, we used Crawford and Garthwaite's operational criteria, and we compared the patient's and the control group's performance with a modified one-tailed t-test designed specifically for single-case studies. There were no statistically significant differences between the patient's and the control group's performances on the non-emotional body movement task or the facial perception task. For both kinds of emotions (basic and complex), when the patient's performance was compared to the control group's, statistically significant differences were only observed for the recognition of body expressions. There were no significant differences between the patient's and the control group's correct answers for emotional facial stimuli. Our results showed a profile of impaired emotion recognition through body expressions and intact performance with facial expressions. This is the first case study that describes the existence of this kind of dissociation pattern between facial and body expressions of basic and complex emotions.
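Modified t-tests of this kind compare one patient with a small control sample by inflating the standard error by sqrt((n + 1)/n). The abstract does not specify the exact variant used, so the sketch below is only an illustration of the common Crawford-and-Howell-style procedure, with invented scores rather than the study's data:

```python
import math
from scipy import stats

def single_case_t(case_score, control_scores):
    """Compare one patient's score with a small control sample using a
    Crawford/Howell-style modified t-test (one-tailed)."""
    n = len(control_scores)
    mean_c = sum(control_scores) / n
    sd_c = math.sqrt(sum((x - mean_c) ** 2 for x in control_scores) / (n - 1))
    # The standard error is inflated by sqrt((n + 1) / n) because the case
    # is not part of the control sample.
    t = (case_score - mean_c) / (sd_c * math.sqrt((n + 1) / n))
    p_one_tailed = stats.t.sf(abs(t), df=n - 1)
    return t, p_one_tailed

# Hypothetical scores: the patient gets 14/30 items correct on a body-emotion
# task; 30 controls cluster around 25/30 (values invented for illustration).
controls = [25, 27, 24, 26, 28, 23, 25, 26, 27, 24,
            25, 26, 28, 24, 23, 27, 26, 25, 24, 26,
            27, 25, 26, 24, 28, 25, 26, 27, 23, 25]
print(single_case_t(14, controls))
```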
Parker, Alison E.; Mathis, Erin T.; Kupersmidt, Janis B.
2016-01-01
The study examined children’s recognition of emotion from faces and body poses, as well as gender differences in these recognition abilities. Preschool-aged children (N = 55) and their parents and teachers participated in the study. Preschool-aged children completed a web-based measure of emotion recognition skills, which included five tasks (three with faces and two with bodies). Parents and teachers reported on children’s aggressive behaviors and social skills. Children’s emotion accuracy on two of the three facial tasks and one of the body tasks was related to teacher reports of social skills. Some of these relations were moderated by child gender. In particular, the relationships between emotion recognition accuracy and reports of children’s behavior were stronger for boys than girls. Identifying preschool-aged children’s strengths and weaknesses in identification of emotion from faces and body poses may be helpful in guiding interventions with children who have problems with social and behavioral functioning that may be due, in part, to emotional knowledge deficits. Further developmental implications of these findings are discussed. PMID:27057129
Facial emotion recognition and borderline personality pathology.
Meehan, Kevin B; De Panfilis, Chiara; Cain, Nicole M; Antonucci, Camilla; Soliani, Antonio; Clarkin, John F; Sambataro, Fabio
2017-09-01
The impact of borderline personality pathology on facial emotion recognition has been in dispute, with impaired, comparable, and enhanced accuracy found in high borderline personality groups. Discrepancies are likely driven by variations in facial emotion recognition tasks across studies (stimulus type and intensity) and heterogeneity in borderline personality pathology. This study evaluates facial emotion recognition for neutral and negative emotions (fear/sadness/disgust/anger) presented at varying intensities. Effortful control was evaluated as a moderator of facial emotion recognition in borderline personality. Non-clinical multicultural undergraduates (n = 132) completed a morphed facial emotion recognition task of neutral and negative emotional expressions across different intensities (100% Neutral; 25%/50%/75% Emotion) and self-reported borderline personality features and effortful control. Greater borderline personality features related to decreased accuracy in detecting neutral faces, but increased accuracy in detecting negative emotion faces, particularly at low-intensity thresholds. This pattern was moderated by effortful control; for individuals with low but not high effortful control, greater borderline personality features related to misattributions of emotion to neutral expressions, and enhanced detection of low-intensity emotional expressions. Individuals with high borderline personality features may therefore exhibit a bias toward detecting negative emotions that are not or barely present; however, good self-regulatory skills may protect against this potential social-cognitive vulnerability. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
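Intensity-graded stimuli of this kind are typically produced by morphing a neutral photograph toward a 100% emotional photograph of the same model. The sketch below, with invented file names, shows the simplest version of that idea as a pixel-wise cross-fade; published morphing procedures usually also warp facial landmarks rather than blending pixels alone, so this is only an approximation of the intensity parameter.

```python
import numpy as np
from PIL import Image

def intensity_morph(neutral_path, emotion_path, intensity):
    """Crude intensity morph: cross-fade between a neutral face and a 100%
    emotional face of the same model at the given intensity (0.0-1.0).
    Assumes the two photographs are aligned and the same size."""
    neutral = np.asarray(Image.open(neutral_path).convert("L"), dtype=float)
    emotion = np.asarray(Image.open(emotion_path).convert("L"), dtype=float)
    blended = (1.0 - intensity) * neutral + intensity * emotion
    return Image.fromarray(blended.astype(np.uint8))

# Hypothetical file names; the intensity levels mirror the 25%/50%/75% steps
# described in the task.
for level in (0.25, 0.50, 0.75):
    intensity_morph("neutral.png", "fear_100.png", level).save(
        f"fear_{int(level * 100)}.png")
```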
Emotion-attention interactions in recognition memory for distractor faces.
Srinivasan, Narayanan; Gupta, Rashmi
2010-04-01
Effective filtering of distractor information has been shown to be dependent on perceptual load. Given the salience of emotional information and the presence of emotion-attention interactions, we explored recognition memory for emotional distractors, especially as a function of focused versus distributed attention, by manipulating load and the spatial spread of attention. We performed two experiments to study emotion-attention interactions by measuring recognition memory performance for distractor neutral and emotional faces. Participants performed a color discrimination task (low-load) or letter identification task (high-load) with a letter string display in Experiment 1 and a high-load letter identification task with letters presented in a circular array in Experiment 2. The stimuli were presented against a distractor face background. The recognition memory results show that happy faces were recognized better than sad faces under conditions of less focused or distributed attention. When attention is more spatially focused, sad faces were recognized better than happy faces. The study provides evidence for emotion-attention interactions in which specific emotional information (sad or happy) is associated with focused or distributed attention, respectively. Distractor processing with emotional information also has implications for theories of attention. Copyright 2010 APA, all rights reserved.
A Multidimensional Approach to the Study of Emotion Recognition in Autism Spectrum Disorders
Xavier, Jean; Vignaud, Violaine; Ruggiero, Rosa; Bodeau, Nicolas; Cohen, David; Chaby, Laurence
2015-01-01
Although deficits in emotion recognition have been widely reported in autism spectrum disorder (ASD), experiments have been restricted to either facial or vocal expressions. Here, we explored multimodal emotion processing in children with ASD (N = 19) and with typical development (TD, N = 19), considering unimodal (faces and voices) and multimodal (faces/voices simultaneously) stimuli and developmental comorbidities (neuro-visual, language and motor impairments). Compared to TD controls, children with ASD had rather high and heterogeneous emotion recognition scores but also showed several significant differences: lower emotion recognition scores for visual stimuli, for neutral emotion, and a greater number of saccades during the visual task. Multivariate analyses showed that: (1) the difficulties they experienced with visual stimuli were partially alleviated with multimodal stimuli. (2) Developmental age was significantly associated with emotion recognition in TD children, whereas it was the case only for the multimodal task in children with ASD. (3) Language impairments tended to be associated with emotion recognition scores of ASD children in the auditory modality. Conversely, in the visual or bimodal (visuo-auditory) tasks, the impact of developmental coordination disorder or neuro-visual impairments was not found. We conclude that impaired emotion processing constitutes a dimension to explore in the field of ASD, as research has the potential to define more homogeneous subgroups and tailored interventions. However, it is clear that developmental age, the nature of the stimuli, and other developmental comorbidities must also be taken into account when studying this dimension. PMID:26733928
Emotion recognition and oxytocin in patients with schizophrenia
Averbeck, B. B.; Bobin, T.; Evans, S.; Shergill, S. S.
2012-01-01
Background Studies have suggested that patients with schizophrenia are impaired at recognizing emotions. Recently, it has been shown that the neuropeptide oxytocin can have beneficial effects on social behaviors. Method To examine emotion recognition deficits in patients and see whether oxytocin could improve these deficits, we carried out two experiments. In the first experiment we recruited 30 patients with schizophrenia and 29 age- and IQ-matched control subjects, and gave them an emotion recognition task. Following this, we carried out a second experiment in which we recruited 21 patients with schizophrenia for a double-blind, placebo-controlled cross-over study of the effects of oxytocin on the same emotion recognition task. Results In the first experiment we found that patients with schizophrenia had a deficit relative to controls in recognizing emotions. In the second experiment we found that administration of oxytocin improved the ability of patients to recognize emotions. The improvement was consistent and occurred for most emotions, and was present whether patients were identifying morphed or non-morphed faces. Conclusions These data add to a growing literature showing beneficial effects of oxytocin on social–behavioral tasks, as well as clinical symptoms. PMID:21835090
Influences on Facial Emotion Recognition in Deaf Children
ERIC Educational Resources Information Center
Sidera, Francesc; Amadó, Anna; Martínez, Laura
2017-01-01
This exploratory research is aimed at studying facial emotion recognition abilities in deaf children and how they relate to linguistic skills and the characteristics of deafness. A total of 166 participants (75 deaf) aged 3-8 years were administered the following tasks: facial emotion recognition, naming vocabulary and cognitive ability. The…
Kliemann, Dorit; Rosenblau, Gabriela; Bölte, Sven; Heekeren, Hauke R.; Dziobek, Isabel
2013-01-01
Recognizing others' emotional states is crucial for effective social interaction. While most facial emotion recognition tasks use explicit prompts that trigger consciously controlled processing, emotional faces are almost exclusively processed implicitly in real life. Recent attempts in social cognition suggest a dual-process perspective, whereby explicit and implicit processes largely operate independently. However, due to differences in methodology, the direct comparison of implicit and explicit social cognition has remained a challenge. Here, we introduce a new tool to comparably measure implicit and explicit processing aspects comprising basic and complex emotions in facial expressions. We developed two video-based tasks with similar answer formats to assess performance in respective facial emotion recognition processes: Face Puzzle, implicit and explicit. To assess the tasks' sensitivity to atypical social cognition and to infer interrelationship patterns between explicit and implicit processes in typical and atypical development, we included healthy adults (NT, n = 24) and adults with autism spectrum disorder (ASD, n = 24). Item analyses yielded good reliability of the new tasks. Group-specific results indicated sensitivity to subtle social impairments in high-functioning ASD. Correlation analyses with established implicit and explicit socio-cognitive measures further supported the tasks' external validity. Between-group comparisons provide first hints of differential relations between implicit and explicit aspects of facial emotion recognition processes in healthy compared to ASD participants. In addition, an increased magnitude of between-group differences in the implicit task was found for a speed-accuracy composite measure. The new Face Puzzle tool thus provides two new tasks to separately assess explicit and implicit social functioning, for instance, to measure subtle impairments as well as potential improvements due to social cognitive interventions. PMID:23805122
Repetition and brain potentials when recognizing natural scenes: task and emotion differences
Bradley, Margaret M.; Codispoti, Maurizio; Karlsson, Marie; Lang, Peter J.
2013-01-01
Repetition has long been known to facilitate memory performance, but its effects on event-related potentials (ERPs), measured as an index of recognition memory, are less well characterized. In Experiment 1, effects of both massed and distributed repetition on old–new ERPs were assessed during an immediate recognition test that followed incidental encoding of natural scenes that also varied in emotionality. Distributed repetition at encoding enhanced both memory performance and the amplitude of an old–new ERP difference over centro-parietal sensors. To assess whether these repetition effects reflect encoding or retrieval differences, the recognition task was replaced with passive viewing of old and new pictures in Experiment 2. In the absence of an explicit recognition task, ERPs were completely unaffected by repetition at encoding, and only emotional pictures prompted a modestly enhanced old–new difference. Taken together, the data suggest that repetition facilitates retrieval processes and that, in the absence of an explicit recognition task, differences in old–new ERPs are only apparent for affective cues. PMID:22842817
Golan, Ofer; Baron-Cohen, Simon; Hill, Jacqueline
2006-02-01
Adults with Asperger Syndrome (AS) can recognise simple emotions and pass basic theory of mind tasks, but have difficulties recognising more complex emotions and mental states. This study describes a new battery of tasks, testing recognition of 20 complex emotions and mental states from faces and voices. The battery was given to males and females with AS and matched controls. Results showed the AS group performed worse than controls overall, on emotion recognition from faces and voices and on 12/20 specific emotions. Females recognised faces better than males regardless of diagnosis, and males with AS had more difficulties recognising emotions from faces than from voices. The implications of these results are discussed in relation to social functioning in AS.
ERIC Educational Resources Information Center
Evers, Kris; Steyaert, Jean; Noens, Ilse; Wagemans, Johan
2015-01-01
Emotion labelling was evaluated in two matched samples of 6-14-year old children with and without an autism spectrum disorder (ASD; N = 45 and N = 50, resp.), using six dynamic facial expressions. The Emotion Recognition Task proved to be valuable demonstrating subtle emotion recognition difficulties in ASD, as we showed a general poorer emotion…
Cerami, Chiara; Dodich, Alessandra; Iannaccone, Sandro; Marcone, Alessandra; Lettieri, Giada; Crespi, Chiara; Gianolli, Luigi; Cappa, Stefano F.; Perani, Daniela
2015-01-01
The behavioural variant of frontotemporal dementia (bvFTD) is a rare disease mainly affecting the social brain. FDG-PET fronto-temporal hypometabolism is a supportive feature for the diagnosis. It may also provide specific functional metabolic signatures for altered socio-emotional processing. In this study, we evaluated the emotion recognition and attribution deficits and FDG-PET cerebral metabolic patterns at the group and individual levels in a sample of sporadic bvFTD patients, exploring the cognitive-functional correlations. Seventeen probable mild bvFTD patients (10 male and 7 female; age 67.8±9.9) were administered standardized and validated versions of social cognition tasks assessing the recognition of basic emotions and the attribution of emotions and intentions (i.e., Ekman 60-Faces test-Ek60F and Story-based Empathy task-SET). FDG-PET was analysed using an optimized voxel-based SPM method at the single-subject and group levels. Severe deficits of emotion recognition and processing characterized the bvFTD condition. At the group level, metabolic dysfunction in the right amygdala, temporal pole, and middle cingulate cortex was highly correlated with the emotional recognition and attribution performances. At the single-subject level, however, heterogeneous impairments of social cognition tasks emerged, and different metabolic patterns, involving limbic structures and prefrontal cortices, were also observed. The derangement of a right limbic network is associated with altered socio-emotional processing in bvFTD patients, but different hypometabolic FDG-PET patterns and heterogeneous performances on social tasks at an individual level exist. PMID:26513651
Biases in facial and vocal emotion recognition in chronic schizophrenia
Dondaine, Thibaut; Robert, Gabriel; Péron, Julie; Grandjean, Didier; Vérin, Marc; Drapier, Dominique; Millet, Bruno
2014-01-01
There has been extensive research on impaired emotion recognition in schizophrenia in the facial and vocal modalities. The literature points to biases toward non-relevant emotions for emotional faces, but few studies have examined biases in emotional recognition across different modalities (facial and vocal). In order to test emotion recognition biases, we exposed 23 patients with stabilized chronic schizophrenia and 23 healthy controls (HCs) to emotional facial and vocal tasks, asking them to rate emotional intensity on visual analog scales. We showed that patients with schizophrenia provided higher intensity ratings on the non-target scales (e.g., surprise scale for fear stimuli) than HCs for both tasks. Furthermore, with the exception of neutral vocal stimuli, they provided the same intensity ratings on the target scales as the HCs. These findings suggest that patients with chronic schizophrenia have emotional biases when judging emotional stimuli in the visual and vocal modalities. These biases may stem from a basic sensory deficit, a higher-order cognitive dysfunction, or both. The respective roles of prefrontal-subcortical circuitry and the basal ganglia are discussed. PMID:25202287
Social approach and emotion recognition in fragile X syndrome.
Williams, Tracey A; Porter, Melanie A; Langdon, Robyn
2014-03-01
Evidence is emerging that individuals with Fragile X syndrome (FXS) display emotion recognition deficits, which may contribute to their significant social difficulties. The current study investigated the emotion recognition abilities, and social approachability judgments, of FXS individuals when processing emotional stimuli. Relative to chronological age- (CA-) and mental age- (MA-) matched controls, the FXS group performed significantly more poorly on the emotion recognition tasks, and displayed a bias towards detecting negative emotions. Moreover, after controlling for emotion recognition deficits, the FXS group displayed significantly reduced ratings of social approachability. These findings suggest that a social anxiety pattern, rather than poor socioemotional processing, may best explain the social avoidance observed in FXS.
Emotional Recognition in Autism Spectrum Conditions from Voices and Faces
ERIC Educational Resources Information Center
Stewart, Mary E.; McAdam, Clair; Ota, Mitsuhiko; Peppe, Sue; Cleland, Joanne
2013-01-01
The present study reports on a new vocal emotion recognition task and assesses whether people with autism spectrum conditions (ASC) perform differently from typically developed individuals on tests of emotional identification from both the face and the voice. The new test of vocal emotion contained trials in which the vocal emotion of the sentence…
Corcoran, C M; Keilp, J G; Kayser, J; Klim, C; Butler, P D; Bruder, G E; Gur, R C; Javitt, D C
2015-10-01
Schizophrenia is characterized by profound and disabling deficits in the ability to recognize emotion in facial expression and tone of voice. Although these deficits are well documented in established schizophrenia using recently validated tasks, their predictive utility in at-risk populations has not been formally evaluated. The Penn Emotion Recognition and Discrimination tasks, and recently developed measures of auditory emotion recognition, were administered to 49 clinical high-risk subjects prospectively followed for 2 years for schizophrenia outcome, to 31 healthy controls, and to a developmental cohort of 43 individuals aged 7-26 years. Deficits in emotion recognition in at-risk subjects were compared with deficits in established schizophrenia, and with normal neurocognitive growth curves from childhood to early adulthood. Deficits in emotion recognition significantly distinguished at-risk patients who transitioned to schizophrenia. By contrast, more general neurocognitive measures, such as attention vigilance or processing speed, were non-predictive. The best classification model for schizophrenia onset included both face emotion processing and negative symptoms, with accuracy of 96%, and area under the receiver-operating characteristic curve of 0.99. In a parallel developmental study, emotion recognition abilities were found to reach maturity prior to the traditional age of risk for schizophrenia, suggesting they may serve as objective markers of early developmental insult. Profound deficits in emotion recognition exist in at-risk patients prior to schizophrenia onset. They may serve as an index of early developmental insult, and represent an effective target for early identification and remediation. Future studies investigating emotion recognition deficits at both mechanistic and predictive levels are strongly encouraged.
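The classification result reported here combines two predictors (face emotion processing and negative symptoms) and is summarized by accuracy and ROC AUC. The abstract does not specify the classifier used, so the sketch below only illustrates, with simulated data, an ordinary logistic regression and invented variable names, how such a two-predictor model and its ROC curve might be evaluated; it is not the authors' analysis.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, accuracy_score
from sklearn.model_selection import cross_val_predict

# Hypothetical per-subject data: emotion recognition accuracy, negative-symptom
# severity, and 2-year transition outcome (1 = developed schizophrenia).
rng = np.random.default_rng(0)
n = 49
X = np.column_stack([
    rng.normal(70, 10, n),   # emotion recognition accuracy (%)
    rng.normal(10, 4, n),    # negative symptom severity
])
y = (X[:, 0] - X[:, 1] < 55).astype(int)  # toy rule standing in for real outcomes

# Cross-validated predicted probabilities, then AUC and accuracy.
model = LogisticRegression()
pred_prob = cross_val_predict(model, X, y, cv=5, method="predict_proba")[:, 1]
print("AUC:", roc_auc_score(y, pred_prob))
print("Accuracy:", accuracy_score(y, (pred_prob > 0.5).astype(int)))
```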
Emotion Recognition in Face and Body Motion in Bulimia Nervosa.
Dapelo, Marcela Marin; Surguladze, Simon; Morris, Robin; Tchanturia, Kate
2017-11-01
Social cognition has been studied extensively in anorexia nervosa (AN), but there are few studies in bulimia nervosa (BN). This study investigated the ability of people with BN to recognise emotions in ambiguous facial expressions and in body movement. Participants were 26 women with BN, who were compared with 35 with AN, and 42 healthy controls. Participants completed an emotion recognition task by using faces portraying blended emotions, along with a body emotion recognition task by using videos of point-light walkers. The results indicated that BN participants exhibited difficulties recognising disgust in less-ambiguous facial expressions, and a tendency to interpret non-angry faces as anger, compared with healthy controls. These difficulties were similar to those found in AN. There were no significant differences amongst the groups in body motion emotion recognition. The findings suggest that difficulties with disgust and anger recognition in facial expressions may be shared transdiagnostically in people with eating disorders. Copyright © 2017 John Wiley & Sons, Ltd and Eating Disorders Association.
Recognition of emotion from body language among patients with unipolar depression
Loi, Felice; Vaidya, Jatin G.; Paradiso, Sergio
2013-01-01
Major depression may be associated with abnormal perception of emotions and impairment in social adaptation. Emotion recognition from body language and its possible implications for social adjustment have not been examined in patients with depression. Three groups of participants (51 with depression; 68 with a history of depression in remission; and 69 never-depressed healthy volunteers) were compared on static and dynamic tasks of emotion recognition from body language. Psychosocial adjustment was assessed using the Social Adjustment Scale Self-Report (SAS-SR). Participants with current depression showed reduced recognition accuracy for happy stimuli across tasks relative to remission and comparison participants. Participants with depression tended to show poorer psychosocial adaptation relative to remission and comparison groups. Correlations between perception accuracy of happiness and scores on the SAS-SR were largely not significant. These results indicate that depression is associated with reduced ability to appraise positive stimuli of emotional body language, but emotion recognition performance is not tied to social adjustment. These alterations do not appear to be present in participants in remission, suggesting state-like qualities. PMID:23608159
McIntosh, Lindsey G; Mannava, Sishir; Camalier, Corrie R; Folley, Bradley S; Albritton, Aaron; Konrad, Peter E; Charles, David; Park, Sohee; Neimat, Joseph S
2014-01-01
Parkinson's disease (PD) is traditionally regarded as a neurodegenerative movement disorder; however, nigrostriatal dopaminergic degeneration is also thought to disrupt non-motor loops connecting basal ganglia to areas in frontal cortex involved in cognition and emotion processing. PD patients are impaired on tests of emotion recognition, but it is difficult to disentangle this deficit from the more general cognitive dysfunction that frequently accompanies disease progression. Testing for emotion recognition deficits early in the disease course, prior to cognitive decline, better assesses the sensitivity of these non-motor corticobasal ganglia-thalamocortical loops involved in emotion processing to early degenerative change in basal ganglia circuits. In addition, contrasting this with a group of healthy aging individuals demonstrates changes in emotion processing specific to the degeneration of basal ganglia circuitry in PD. Early PD patients (EPD) were recruited from a randomized clinical trial testing the safety and tolerability of deep brain stimulation (DBS) of the subthalamic nucleus (STN-DBS) in early-stage PD. EPD patients were previously randomized to receive optimal drug therapy only (ODT), or drug therapy plus STN-DBS (ODT + DBS). Matched healthy elderly controls (HEC) and young controls (HYC) also participated in this study. Participants completed two control tasks and three emotion recognition tests that varied in stimulus domain. EPD patients were impaired on all emotion recognition tasks compared to HEC. Neither therapy type (ODT or ODT + DBS) nor therapy state (ON/OFF) altered emotion recognition performance in this study. Finally, HEC were impaired on vocal emotion recognition relative to HYC, suggesting a decline related to healthy aging. This study supports the existence of impaired emotion recognition early in the PD course, implicating an early disruption of fronto-striatal loops mediating emotional function.
Rieffe, Carolien; Wiefferink, Carin H
2017-03-01
The capacity for emotion recognition and understanding is crucial for daily social functioning. We examined to what extent this capacity is impaired in young children with a Language Impairment (LI). In typical development, children learn to recognize emotions in faces and situations through social experiences and social learning. Children with LI have less access to these experiences and are therefore expected to fall behind their peers without LI. In this study, 89 preschool children with LI and 202 children without LI (mean age 3 years and 10 months in both groups) were tested on three indices for facial emotion recognition (discrimination, identification, and attribution in emotion evoking situations). Parents reported on their children's emotion vocabulary and ability to talk about their own emotions. Preschoolers with and without LI performed similarly on the non-verbal task for emotion discrimination. Children with LI fell behind their peers without LI on both other tasks for emotion recognition that involved labelling the four basic emotions (happy, sad, angry, fear). The outcomes of these two tasks were also related to children's level of emotion language. These outcomes emphasize the importance of 'emotion talk' at the youngest age possible for children with LI. Copyright © 2017 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Wright, Barry; Clarke, Natalie; Jordan, Jo; Young, Andrew W.; Clarke, Paula; Miles, Jeremy; Nation, Kate; Clarke, Leesa; Williams, Christine
2008-01-01
We compared young people with high-functioning autism spectrum disorders (ASDs) with age, sex and IQ matched controls on emotion recognition of faces and pictorial context. Each participant completed two tests of emotion recognition. The first used Ekman series faces. The second used facial expressions in visual context. A control task involved…
ERIC Educational Resources Information Center
Birmingham, Elina; Meixner, Tamara; Iarocci, Grace; Kanan, Christopher; Smilek, Daniel; Tanaka, James W.
2013-01-01
The strategies children employ to selectively attend to different parts of the face may reflect important developmental changes in facial emotion recognition. Using the Moving Window Technique (MWT), children aged 5-12 years and adults ("N" = 129) explored faces with a mouse-controlled window in an emotion recognition task. An…
Interference with facial emotion recognition by verbal but not visual loads.
Reed, Phil; Steed, Ian
2015-12-01
The ability to recognize emotions through facial characteristics is critical for social functioning, but is often impaired in those with a developmental or intellectual disability. The current experiments explored the degree to which interfering with the processing capacities of typically developing individuals would produce a similar inability to recognize emotions from the facial elements of faces displaying particular emotions. It was found that increasing the cognitive load (in an attempt to model learning impairments in a typically developing population) produced deficits in correctly identifying emotions from facial elements. However, this effect was much more pronounced when using a concurrent verbal task than when employing a concurrent visual task, suggesting that there is a substantial verbal element to the labeling and subsequent recognition of emotions. This concurs with previous work conducted with those with developmental disabilities that suggests emotion recognition deficits are connected with language deficits. Copyright © 2015 Elsevier Ltd. All rights reserved.
Auditory processing deficits in bipolar disorder with and without a history of psychotic features.
Zenisek, RyAnna; Thaler, Nicholas S; Sutton, Griffin P; Ringdahl, Erik N; Snyder, Joel S; Allen, Daniel N
2015-11-01
Auditory perception deficits have been identified in schizophrenia (SZ) and linked to dysfunction in the auditory cortex. Given that psychotic symptoms, including auditory hallucinations, are also seen in bipolar disorder (BD), it may be that individuals with BD who also exhibit psychotic symptoms demonstrate a similar impairment in auditory perception. Fifty individuals with SZ, 30 individuals with bipolar I disorder with a history of psychosis (BD+), 28 individuals with bipolar I disorder with no history of psychotic features (BD-), and 29 normal controls (NC) were administered a tone discrimination task and an emotion recognition task. Mixed-model analyses of covariance with planned comparisons indicated that individuals with BD+ performed at a level that was intermediate between those with BD- and those with SZ on the more difficult condition of the tone discrimination task and on the auditory condition of the emotion recognition task. There were no differences between the BD+ and BD- groups on the visual or auditory-visual affect recognition conditions. Regression analyses indicated that performance on the tone discrimination task predicted performance on all conditions of the emotion recognition task. Auditory hallucinations in BD+ were not related to performance on either task. Our findings suggested that, although deficits in frequency discrimination and emotion recognition are more severe in SZ, these impairments extend to BD+. Although our results did not support the idea that auditory hallucinations may be related to these deficits, they indicated that basic auditory deficits may be a marker for psychosis, regardless of SZ or BD diagnosis. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
The development of cross-cultural recognition of vocal emotion during childhood and adolescence.
Chronaki, Georgia; Wigelsworth, Michael; Pell, Marc D; Kotz, Sonja A
2018-06-14
Humans have an innate set of emotions recognised universally. However, emotion recognition also depends on socio-cultural rules. Although adults recognise vocal emotions universally, they identify emotions more accurately in their native language. We examined developmental trajectories of universal vocal emotion recognition in children. Eighty native English speakers completed a vocal emotion recognition task in their native language (English) and foreign languages (Spanish, Chinese, and Arabic) expressing anger, happiness, sadness, fear, and neutrality. Emotion recognition was compared across 8- to 10-year-olds, 11- to 13-year-olds, and adults. Measures of behavioural and emotional problems were also taken. Results showed that although emotion recognition was above chance for all languages, native English-speaking children were more accurate in recognising vocal emotions in their native language. There was a larger improvement in recognising vocal emotion from the native language during adolescence. Vocal anger recognition did not improve with age for the non-native languages. This is the first study to demonstrate universality of vocal emotion recognition in children whilst supporting an "in-group advantage" for more accurate recognition in the native language. Findings highlight the role of experience in emotion recognition, have implications for child development in modern multicultural societies and address important theoretical questions about the nature of emotions.
Explicit and spontaneous retrieval of emotional scenes: electrophysiological correlates.
Weymar, Mathias; Bradley, Margaret M; El-Hinnawi, Nasryn; Lang, Peter J
2013-10-01
When event-related potentials (ERP) are measured during a recognition task, items that have previously been presented typically elicit a larger late (400-800 ms) positive potential than new items. Recent data, however, suggest that emotional, but not neutral, pictures show ERP evidence of spontaneous retrieval when presented in a free-viewing task (Ferrari, Bradley, Codispoti, Karlsson, & Lang, 2012). In two experiments, we further investigated the brain dynamics of implicit and explicit retrieval. In Experiment 1, brain potentials were measured during a semantic categorization task, which did not explicitly probe episodic memory, but which, like a recognition task, required an active decision and a button press, and were compared to those elicited during recognition and free viewing. Explicit recognition prompted a late enhanced positivity for previously presented, compared with new, pictures regardless of hedonic content. In contrast, only emotional pictures showed an old-new difference when the task did not explicitly probe episodic memory, either when making an active categorization decision regarding picture content, or when simply viewing pictures. In Experiment 2, however, neutral pictures did prompt a significant old-new ERP difference during subsequent free viewing when emotionally arousing pictures were not included in the encoding set. These data suggest that spontaneous retrieval is heightened for salient cues, perhaps reflecting heightened attention and elaborative processing at encoding.
Explicit and spontaneous retrieval of emotional scenes: Electrophysiological correlates
Weymar, Mathias; Bradley, Margaret M.; El-Hinnawi, Nasryn; Lang, Peter J.
2014-01-01
When event-related potentials are measured during a recognition task, items that have previously been presented typically elicit a larger late (400–800 ms) positive potential than new items. Recent data, however, suggest that emotional, but not neutral, pictures show ERP evidence of spontaneous retrieval when presented in a free-viewing task (Ferrari, Bradley, Codispoti & Lang, 2012). In two experiments, we further investigated the brain dynamics of implicit and explicit retrieval. In Experiment 1, brain potentials were measured during a semantic categorization task, which did not explicitly probe episodic memory, but which, like a recognition task, required an active decision and a button press, and were compared to those elicited during recognition and free viewing. Explicit recognition prompted a late enhanced positivity for previously presented, compared to new, pictures regardless of hedonic content. In contrast, only emotional pictures showed an old-new difference when the task did not explicitly probe episodic memory, either when making an active categorization decision regarding picture content, or when simply viewing pictures. In Experiment 2, however, neutral pictures did prompt a significant old-new ERP difference during subsequent free viewing when emotionally arousing pictures were not included in the encoding set. These data suggest that spontaneous retrieval is heightened for salient cues, perhaps reflecting heightened attention and elaborative processing at encoding. PMID:23795588
Enrici, Ivan; Adenzato, Mauro; Ardito, Rita B.; Mitkova, Antonia; Cavallo, Marco; Zibetti, Maurizio; Lopiano, Leonardo; Castelli, Lorys
2015-01-01
Background Parkinson’s disease (PD) is characterised by well-known motor symptoms, whereas the presence of cognitive non-motor symptoms, such as emotional disturbances, is still underestimated. One of the major problems in studying emotion deficits in PD is an atomising approach that does not take into account different levels of emotion elaboration. Our study addressed the question of whether people with PD exhibit difficulties in one or more specific dimensions of emotion processing, investigating three different levels of analysis, that is, recognition, representation, and regulation. Methodology Thirty-two consecutive medicated patients with PD and 25 healthy controls were enrolled in the study. Participants completed a three-level assessment of emotional processing using quantitative standardised emotional tasks: the Ekman 60-Faces for emotion recognition, the full 36-item version of the Reading the Mind in the Eyes (RME) for emotion representation, and the 20-item Toronto Alexithymia Scale (TAS-20) for emotion regulation. Principal Findings Regarding emotion recognition, patients obtained significantly worse scores than controls on the total score of the Ekman 60-Faces but not for any individual basic emotion. For emotion representation, patients obtained significantly worse scores than controls on the RME experimental score but not on the RME gender control task. Finally, on emotion regulation, patients with PD and controls did not perform differently on the TAS-20, and no specific differences were found on the TAS-20 subscales. The PD impairments in emotion recognition and representation do not correlate with dopamine therapy, disease severity, or duration of illness. These results are independent of other cognitive processes, such as global cognitive status and executive function, and of psychiatric status, such as depression, anxiety or apathy. Conclusions These results may contribute to a better understanding of the emotional problems often seen in patients with PD and of the measures used to assess them, in particular the use of different versions of the RME task. PMID:26110271
Recognition and Posing of Emotional Expressions by Abused Children and Their Mothers.
ERIC Educational Resources Information Center
Camras, Linda A.; And Others
1988-01-01
A total of 20 abused and 20 nonabused mother-child pairs (children aged three to seven years) participated in a facial expression posing task and a facial expression recognition task. Findings suggest that abused children may not observe their mothers' easily interpreted, voluntary displays of emotion as often as nonabused children do.
Music to my ears: Age-related decline in musical and facial emotion recognition.
Sutcliffe, Ryan; Rendell, Peter G; Henry, Julie D; Bailey, Phoebe E; Ruffman, Ted
2017-12-01
We investigated differences between young and older adults in emotion recognition using music and face stimuli and tested explanatory hypotheses regarding older adults' typically worse emotion recognition. In Experiment 1, young and older adults labeled emotions in an established set of faces, and in classical piano stimuli that we pilot-tested on other young and older adults. Older adults were worse at detecting anger, sadness, fear, and happiness in music. Performance on the music and face emotion tasks was not correlated for either age group. Because musical expressions of fear were not equated for age groups in the pilot study of Experiment 1, we conducted a second experiment in which we created a novel set of music stimuli that included more accessible musical styles, and which we again pilot-tested on young and older adults. In this pilot study, all musical emotions were identified similarly by young and older adults. In Experiment 2, participants also made age estimations in another set of faces to examine whether potential relations between the face and music emotion tasks would be shared with the age estimation task. Older adults did worse in each of the tasks, and had specific difficulty recognizing happy, sad, peaceful, angry, and fearful music clips. Older adults' difficulties in each of the three tasks (music emotion, face emotion, and face age) were not correlated with each other. General cognitive decline did not appear to explain our results, as increasing age predicted emotion performance even after fluid IQ was controlled for within the older adult group. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Kosaka, H; Omori, M; Murata, T; Iidaka, T; Yamada, H; Okada, T; Takahashi, T; Sadato, N; Itoh, H; Yonekura, Y; Wada, Y
2002-09-01
Human lesion and neuroimaging studies suggest that the amygdala is involved in facial emotion recognition. Although impairments in recognition of facial and/or emotional expression have been reported in schizophrenia, there are few neuroimaging studies that have examined differential brain activation during facial recognition between patients with schizophrenia and normal controls. To investigate amygdala responses during facial recognition in schizophrenia, we conducted a functional magnetic resonance imaging (fMRI) study with 12 right-handed medicated patients with schizophrenia and 12 age- and sex-matched healthy controls. The experimental task was an emotional intensity judgment task. During the task period, subjects were asked to view happy (or angry/disgusting/sad) and neutral faces simultaneously presented every 3 s and to judge which face was more emotional (positive or negative face discrimination). Imaging data were analysed on a voxel-by-voxel basis for single-group analysis and for between-group analysis according to the random-effects model using Statistical Parametric Mapping (SPM). No significant difference in task accuracy was found between the schizophrenic and control groups. Positive face discrimination activated the bilateral amygdalae of both controls and schizophrenics, with more prominent activation of the right amygdala shown in the schizophrenic group. Negative face discrimination activated the bilateral amygdalae in the schizophrenic group, whereas only the right amygdala was activated in the control group, although no significant group difference was found. Exaggerated amygdala activation during emotional intensity judgment found in the schizophrenic patients may reflect impaired gating of sensory input containing emotion. Copyright 2002 Elsevier Science B.V.
Sleep deprivation impairs the accurate recognition of human emotions.
van der Helm, Els; Gujar, Ninad; Walker, Matthew P
2010-03-01
Investigate the impact of sleep deprivation on the ability to recognize the intensity of human facial emotions. Randomized total sleep-deprivation or sleep-rested conditions, involving between-group and within-group repeated measures analysis. Experimental laboratory study. Thirty-seven healthy participants, (21 females) aged 18-25 y, were randomly assigned to the sleep control (SC: n = 17) or total sleep deprivation group (TSD: n = 20). Participants performed an emotional face recognition task, in which they evaluated 3 different affective face categories: Sad, Happy, and Angry, each ranging in a gradient from neutral to increasingly emotional. In the TSD group, the task was performed once under conditions of sleep deprivation, and twice under sleep-rested conditions following different durations of sleep recovery. In the SC group, the task was performed twice under sleep-rested conditions, controlling for repeatability. In the TSD group, when sleep-deprived, there was a marked and significant blunting in the recognition of Angry and Happy affective expressions in the moderate (but not extreme) emotional intensity range; differences that were most reliable and significant in female participants. No change in the recognition of Sad expressions was observed. These recognition deficits were, however, ameliorated following one night of recovery sleep. No changes in task performance were observed in the SC group. Sleep deprivation selectively impairs the accurate judgment of human facial emotions, especially threat relevant (Anger) and reward relevant (Happy) categories, an effect observed most significantly in females. Such findings suggest that sleep loss impairs discrete affective neural systems, disrupting the identification of salient affective social cues.
"We all look the same to me": positive emotions eliminate the own-race in face recognition.
Johnson, Kareem J; Fredrickson, Barbara L
2005-11-01
Extrapolating from the broaden-and-build theory, we hypothesized that positive emotion may reduce the own-race bias in facial recognition. In Experiments 1 and 2, Caucasian participants (N = 89) viewed Black and White faces for a recognition task. They viewed videos eliciting joy, fear, or neutrality before the learning (Experiment 1) or testing (Experiment 2) stages of the task. Results reliably supported the hypothesis. Relative to fear or a neutral state, joy experienced before either stage improved recognition of Black faces and significantly reduced the own-race bias. Discussion centers on possible mechanisms for this reduction of the own-race bias, including improvements in holistic processing and promotion of a common in-group identity due to positive emotions.
Gaudelus, B; Virgile, J; Peyroux, E; Leleu, A; Baudouin, J-Y; Franck, N
2015-06-01
The impairment of social cognition, including facial affect recognition, is a well-established trait in schizophrenia, and specific cognitive remediation programs focusing on facial affect recognition have been developed by different teams worldwide. However, even though social cognitive impairments have been confirmed, previous studies have also shown heterogeneity of results across subjects. Therefore, individual abilities should be assessed before proposing such programs. Most research teams apply facial affect recognition tasks based on the stimuli of Ekman et al. or Gur et al. However, these tasks are not easily applicable in routine clinical practice. Here, we present the Facial Emotions Recognition Test (TREF), which is designed to identify facial affect recognition impairments in clinical practice. The test is composed of 54 photos and evaluates abilities in the recognition of six universal emotions (joy, anger, sadness, fear, disgust and contempt). Each of these emotions is represented with colored photos of 4 different models (two men and two women) at nine intensity levels from 20 to 100%. Each photo is presented for 10 seconds; no time limit for responding is applied. The present study compared TREF scores in a sample of healthy controls (64 subjects) and people with stabilized schizophrenia diagnosed according to DSM-IV-TR criteria (45 subjects). We analysed global scores for all emotions, as well as sub-scores for each emotion, between these two groups, taking into account gender differences. Our results were consistent with previous findings. Applying the TREF, we confirmed an impairment in facial affect recognition in schizophrenia by showing significant differences between the two groups in global scores (76.45% for healthy controls versus 61.28% for people with schizophrenia), as well as in sub-scores for each emotion except joy. Scores for women were significantly higher than those for men in the population without a psychiatric diagnosis. The study also allowed the identification of cut-off scores: results more than 2 standard deviations below the healthy control average (i.e., below 61.57%) pointed to a facial affect recognition deficit. The TREF appears to be a useful tool to identify facial affect recognition impairment in schizophrenia. Neuropsychologists who have tried this task have given positive feedback. The TREF is easy to use (about 15 minutes), easy to apply in subjects with attentional difficulties, and tests facial affect recognition at ecologically relevant intensity levels. These results will need to be confirmed with larger samples and in comparison with other tasks evaluating facial affect recognition processes. Copyright © 2014 L’Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.
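The deficit criterion described here, scores falling more than 2 standard deviations below the healthy-control mean (61.57% in this sample), is simple to compute. A minimal sketch with invented control accuracies, not the study's data:

```python
import statistics

def deficit_cutoff(control_scores, k=2.0):
    """Cut-off = control mean minus k standard deviations
    (k = 2 mirrors the criterion described for the TREF)."""
    return statistics.mean(control_scores) - k * statistics.stdev(control_scores)

# Hypothetical control accuracies (%); the paper derives its 61.57% cut-off
# from its own sample of 64 healthy controls.
controls = [78.0, 74.5, 81.0, 70.2, 76.9, 72.4, 79.6, 75.3]
cutoff = deficit_cutoff(controls)
print(f"Scores below {cutoff:.2f}% would be flagged as a deficit")
```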
ERIC Educational Resources Information Center
Cebula, Katie R.; Wishart, Jennifer G.; Willis, Diane S.; Pitcairn, Tom K.
2017-01-01
Some children with Down syndrome may experience difficulties in recognizing facial emotions, particularly fear, but it is not clear why, nor how such skills can best be facilitated. Using a photo-matching task, emotion recognition was tested in children with Down syndrome, children with nonspecific intellectual disability and cognitively matched,…
Facial Emotion Recognition in Child Psychiatry: A Systematic Review
ERIC Educational Resources Information Center
Collin, Lisa; Bindra, Jasmeet; Raju, Monika; Gillberg, Christopher; Minnis, Helen
2013-01-01
This review focuses on facial affect (emotion) recognition in children and adolescents with psychiatric disorders other than autism. A systematic search, using PRISMA guidelines, was conducted to identify original articles published prior to October 2011 pertaining to face recognition tasks in case-control studies. Used in the qualitative…
Emotion Understanding in Children with ADHD
ERIC Educational Resources Information Center
Da Fonseca, David; Seguier, Valerie; Santos, Andreia; Poinso, Francois; Deruelle, Christine
2009-01-01
Several studies suggest that children with ADHD tend to perform worse than typically developing children on emotion recognition tasks. However, most of these studies have focused on the recognition of facial expression, while there is evidence that context plays a major role on emotion perception. This study aims at further investigating emotion…
Oxytocin Reduces Face Processing Time but Leaves Recognition Accuracy and Eye-Gaze Unaffected.
Hubble, Kelly; Daughters, Katie; Manstead, Antony S R; Rees, Aled; Thapar, Anita; van Goozen, Stephanie H M
2017-01-01
Previous studies have found that oxytocin (OXT) can improve the recognition of emotional facial expressions; it has been proposed that this effect is mediated by an increase in attention to the eye-region of faces. Nevertheless, evidence in support of this claim is inconsistent, and few studies have directly tested the effect of oxytocin on emotion recognition via altered eye-gaze. In a double-blind, within-subjects, randomized controlled experiment, 40 healthy male participants received 24 IU intranasal OXT and placebo in two identical experimental sessions separated by a 2-week interval. Visual attention to the eye-region was assessed on both occasions while participants completed a static facial emotion recognition task using medium-intensity facial expressions. Although OXT had no effect on emotion recognition accuracy, recognition performance was improved because face processing was faster across emotions under the influence of OXT. This effect was marginally significant (p<.06). Consistent with a previous study using dynamic stimuli, OXT had no effect on eye-gaze patterns when viewing static emotional faces, and this was not related to recognition accuracy or face processing time. These findings suggest that OXT-induced enhanced facial emotion recognition is not necessarily mediated by an increase in attention to the eye-region of faces, as previously assumed. We discuss several methodological issues which may explain discrepant findings and suggest the effect of OXT on visual attention may differ depending on task requirements. (JINS, 2017, 23, 23-33).
Basic and complex emotion recognition in children with autism: cross-cultural findings.
Fridenson-Hayo, Shimrit; Berggren, Steve; Lassalle, Amandine; Tal, Shahar; Pigat, Delia; Bölte, Sven; Baron-Cohen, Simon; Golan, Ofer
2016-01-01
Children with autism spectrum conditions (ASC) have emotion recognition deficits when tested in different expression modalities (face, voice, body). However, these findings usually focus on basic emotions, using one or two expression modalities. In addition, cultural similarities and differences in emotion recognition patterns in children with ASC have not been explored before. The current study examined the similarities and differences in the recognition of basic and complex emotions by children with ASC and typically developing (TD) controls across three cultures: Israel, Britain, and Sweden. Fifty-five children with high-functioning ASC, aged 5-9, were compared to 58 TD children. On each site, groups were matched on age, sex, and IQ. Children were tested using four tasks, examining recognition of basic and complex emotions from voice recordings, videos of facial and bodily expressions, and emotional video scenarios including all modalities in context. Compared to their TD peers, children with ASC showed emotion recognition deficits in both basic and complex emotions on all three modalities and their integration in context. Complex emotions were harder to recognize, compared to basic emotions for the entire sample. Cross-cultural agreement was found for all major findings, with minor deviations on the face and body tasks. Our findings highlight the multimodal nature of ER deficits in ASC, which exist for basic as well as complex emotions and are relatively stable cross-culturally. Cross-cultural research has the potential to reveal both autism-specific universal deficits and the role that specific cultures play in the way empathy operates in different countries.
Gonzalez-Gadea, Maria Luz; Herrera, Eduar; Parra, Mario; Gomez Mendez, Pedro; Baez, Sandra; Manes, Facundo; Ibanez, Agustin
2014-01-01
Emotion recognition and empathy abilities require the integration of contextual information in real-life scenarios. Previous reports have explored these domains in adolescent offenders (AOs) but have not used tasks that replicate everyday situations. In this study we included ecological measures with different levels of contextual dependence to evaluate emotion recognition and empathy in AOs relative to non-offenders, controlling for the effect of demographic variables. We also explored the influence of fluid intelligence (FI) and executive functions (EFs) in the prediction of relevant deficits in these domains. Our results showed that AOs exhibit deficits in context-sensitive measures of emotion recognition and cognitive empathy. Difficulties in these tasks were neither explained by demographic variables nor predicted by FI or EFs. However, performance on measures that included simpler stimuli or could be solved by explicit knowledge was either only partially affected by demographic variables or preserved in AOs. These findings indicate that AOs show contextual social-cognition impairments which are relatively independent of basic cognitive functioning and demographic variables. PMID:25374529
Rigon, Arianna; Turkstra, Lyn; Mutlu, Bilge; Duff, Melissa
2016-10-01
Although moderate to severe traumatic brain injury (TBI) leads to facial affect recognition impairments in up to 39% of individuals, protective and risk factors for these deficits are unknown. The aim of the current study was to examine the effect of sex on emotion recognition abilities following TBI. We administered two separate emotion recognition tests (one static and one dynamic) to 53 individuals with moderate to severe TBI (females = 28) and 49 demographically matched comparisons (females = 22). We then investigated the presence of a sex-by-group interaction in emotion recognition accuracy. In the comparison group, there were no sex differences. In the TBI group, however, females significantly outperformed males in the dynamic (but not the static) task. Moreover, males (but not females) with TBI performed significantly worse than comparison participants in the dynamic task. Further analysis revealed that sex differences in emotion recognition abilities within the TBI group could not be explained by lesion location, TBI severity, or other neuropsychological variables. These findings suggest that sex may serve as a protective factor for social impairment following TBI and inform clinicians working with TBI as well as research on the neurophysiological correlates of sex differences in social functioning.
Differential effects of MDMA and methylphenidate on social cognition.
Schmid, Yasmin; Hysek, Cédric M; Simmler, Linda D; Crockett, Molly J; Quednow, Boris B; Liechti, Matthias E
2014-09-01
Social cognition is important in everyday social interactions. The social cognitive effects of 3,4-methylenedioxymethamphetamine (MDMA, 'ecstasy') and methylphenidate (both used for neuroenhancement and as party drugs) are largely unknown. We investigated the acute effects of MDMA (75 mg), methylphenidate (40 mg) and placebo using the Facial Emotion Recognition Task, Multifaceted Empathy Test (MET), Movie for the Assessment of Social Cognition, Social Value Orientation Test and the Moral Judgment Task in a cross-over study in 30 healthy subjects. Additionally, subjective, autonomic, pharmacokinetic, endocrine and adverse drug effects were measured. MDMA enhanced emotional empathy for positive, emotionally charged situations in the MET and tended to reduce the recognition of sad faces in the Facial Emotion Recognition Task. MDMA had no effects on cognitive empathy in the MET or on social cognitive inferences in the Movie for the Assessment of Social Cognition. MDMA produced subjective 'empathogenic' effects, such as drug liking, closeness to others, openness and trust. In contrast, methylphenidate lacked such subjective effects and did not alter emotional processing, empathy or mental perspective-taking. MDMA but not methylphenidate increased the plasma levels of oxytocin and prolactin. None of the drugs influenced moral judgment. Effects on emotion recognition and emotional empathy were evident at a low dose of MDMA and likely contribute to the popularity of the drug. © The Author(s) 2014.
Emotional content enhances true but not false memory for categorized stimuli.
Choi, Hae-Yoon; Kensinger, Elizabeth A; Rajaram, Suparna
2013-04-01
Past research has shown that emotion enhances true memory, but that emotion can either increase or decrease false memory. Two theoretical possibilities, the distinctiveness of emotional stimuli and the conceptual relatedness of emotional content, have been implicated as being responsible for influencing both true and false memory for emotional content. In the present study, we sought to identify the mechanisms that underlie these mixed findings by equating the thematic relatedness of the study materials across each type of valence used (negative, positive, or neutral). In three experiments, categorically bound stimuli (e.g., funeral, pets, and office items) were used for this purpose. When the encoding task required the processing of thematic relatedness, a significant true-memory enhancement for emotional content emerged in recognition memory, but no emotional boost to false memory (Experiment 1). This pattern persisted for true memory with a longer retention interval between study and test (24 h), and false recognition was reduced for emotional items (Experiment 2). Finally, better recognition memory for emotional items once again emerged when the encoding task (arousal ratings) required the processing of the emotional aspect of the study items, with no emotional boost to false recognition (Experiment 3). Together, these findings suggest that when emotional and neutral stimuli are equivalently high in thematic relatedness, emotion continues to improve true memory, but it does not override other types of grouping to increase false memory.
Effect of anxiety on memory for emotional information in older adults.
Herrera, Sara; Montorio, Ignacio; Cabrera, Isabel
2017-04-01
Several studies have shown that anxiety is associated with a better memory of negative events. However, this anxiety-related memory bias has not been studied in older adults, who show preferential processing of positive information. The aim was to study the effect of anxiety on a recognition task and an autobiographical memory task in 102 older adults with high and low levels of trait anxiety. Negative, positive and neutral pictures were used in the recognition task. In the autobiographical memory task, participants' memories of their own lives were recorded, along with how they felt when thinking about them and the personal relevance of these memories. In the recognition task, no anxiety-related bias toward negative information was found. Individuals with high trait anxiety were found to remember fewer positive pictures than those with low trait anxiety. In the autobiographical memory task, both groups remembered negative and positive events equally. However, people with high trait anxiety remembered life experiences with more negative emotions, especially when remembering negative events. Individuals with low trait anxiety tended to feel more positive emotions when remembering their life experiences, particularly when remembering negative events. Older adults with anxiety tend to recognize less positive information and to experience more negative emotions when remembering life events, whereas individuals without anxiety have a more positive experience of negative memories.
Stroud, J B; Freeman, T P; Leech, R; Hindocha, C; Lawn, W; Nutt, D J; Curran, H V; Carhart-Harris, R L
2018-02-01
Depressed patients robustly exhibit affective biases in emotional processing which are altered by SSRIs and predict clinical outcome. The objective of this study is to investigate whether psilocybin, recently shown to rapidly improve mood in treatment-resistant depression (TRD), alters patients' emotional processing biases. Seventeen patients with treatment-resistant depression completed a dynamic emotional face recognition task at baseline and 1 month later after two doses of psilocybin with psychological support. Sixteen controls completed the emotional recognition task over the same time frame but did not receive psilocybin. We found evidence for a group × time interaction on speed of emotion recognition (p = .035). At baseline, patients were slower at recognising facial emotions compared with controls (p < .001). After psilocybin, this difference was remediated (p = .208). Emotion recognition was faster at follow-up compared with baseline in patients (p = .004, d = .876) but not controls (p = .263, d = .302). In patients, this change was significantly correlated with a reduction in anhedonia over the same time period (r = .640, p = .010). Psilocybin with psychological support appears to improve processing of emotional faces in treatment-resistant depression, and this correlates with reduced anhedonia. Placebo-controlled studies are warranted to follow up these preliminary findings.
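The psilocybin study above turns on a group × time interaction in recognition speed across repeated testing sessions. As an illustrative sketch only, and not the authors' analysis, such an interaction could be tested with a linear mixed-effects model that includes a random intercept per participant; the column names rt, group, time and subject below are hypothetical.

    # Illustrative sketch (not the published analysis): test a group x time
    # interaction in emotion-recognition reaction times with repeated measures.
    # Assumes a long-format table with hypothetical columns:
    #   rt (reaction time), group (patient/control), time (baseline/follow-up), subject (ID)
    import pandas as pd
    import statsmodels.formula.api as smf

    def group_by_time_interaction(df: pd.DataFrame):
        # A random intercept per subject accounts for the two repeated sessions.
        model = smf.mixedlm("rt ~ group * time", data=df, groups=df["subject"])
        result = model.fit()
        # The group:time coefficient in the summary carries the interaction test.
        return result.summary()

In a model of this form, a reliable group × time coefficient would correspond to the patients' change in recognition speed differing from that of the controls, which is the pattern the abstract reports.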
Buhlmann, Ulrike; Winter, Anna; Kathmann, Norbert
2013-03-01
Body dysmorphic disorder (BDD) is characterized by perceived appearance-related defects, often tied to aspects of the face or head (e.g., acne). Deficits in decoding emotional expressions have been examined in several psychological disorders including BDD. Previous research indicates that BDD is associated with impaired facial emotion recognition, particularly in situations that involve the BDD sufferer him/herself. The purpose of this study was to further evaluate the ability to read other people's emotions among 31 individuals with BDD, and 31 mentally healthy controls. We applied the Reading the Mind in the Eyes task, in which participants are presented with a series of pairs of eyes, one at a time, and are asked to identify the emotion that describes the stimulus best. The groups did not differ with respect to decoding other people's emotions by looking into their eyes. Findings are discussed in light of previous research examining emotion recognition in BDD. Copyright © 2013. Published by Elsevier Ltd.
Soravia, Leila M; Witmer, Joëlle S; Schwab, Simon; Nakataki, Masahito; Dierks, Thomas; Wiest, Roland; Henke, Katharina; Federspiel, Andrea; Jann, Kay
2016-03-01
Low self-referential thoughts are associated with better concentration, which leads to deeper encoding and increases learning and subsequent retrieval. There is evidence that being engaged in externally rather than internally focused tasks is related to low neural activity in the default mode network (DMN), promoting an open mind and the deep elaboration of new information. Thus, reduced DMN activity should lead to enhanced concentration, comprehensive stimulus evaluation including emotional categorization, deeper stimulus processing, and better long-term retention over a whole week. In this fMRI study, we investigated brain activation preceding and during incidental encoding of emotional pictures and its relation to subsequent recognition performance. During fMRI, 24 subjects were exposed to 80 pictures of different emotional valence and asked to complete an online recognition task one week later. Results indicate that neural activity within the medial temporal lobes during encoding predicts subsequent memory performance. Moreover, low activity of the default mode network preceding incidental encoding leads to slightly better recognition performance, independent of the emotional perception of a picture. The findings indicate that the suppression of internally oriented thoughts leads to a more comprehensive and thorough evaluation of a stimulus and its emotional valence. Reduced activation of the DMN prior to stimulus onset is associated with deeper encoding and enhanced consolidation and retrieval performance even one week later. Even small prestimulus lapses of attention influence consolidation and subsequent recognition performance. © 2015 Wiley Periodicals, Inc.
Facial emotion recognition in paranoid schizophrenia and autism spectrum disorder.
Sachse, Michael; Schlitt, Sabine; Hainz, Daniela; Ciaramidaro, Angela; Walter, Henrik; Poustka, Fritz; Bölte, Sven; Freitag, Christine M
2014-11-01
Schizophrenia (SZ) and autism spectrum disorder (ASD) share deficits in emotion processing. In order to identify convergent and divergent mechanisms, we investigated facial emotion recognition in SZ, high-functioning ASD (HFASD), and typically developed controls (TD). Different degrees of task difficulty and emotion complexity (face, eyes; basic emotions, complex emotions) were used. Two Benton tests were administered to assess potentially confounding visuo-perceptual functioning and facial processing. Nineteen participants with paranoid SZ, 22 with HFASD and 20 TD were included, aged between 14 and 33 years. Individuals with SZ were comparable to TD in all obtained emotion recognition measures, but showed reduced basic visuo-perceptual abilities. The HFASD group was impaired in the recognition of basic and complex emotions compared to both SZ and TD. When facial identity recognition was adjusted for, group differences remained for the recognition of complex emotions only. Our results suggest that there is a SZ subgroup with predominantly paranoid symptoms that does not show problems in face processing or emotion recognition, but does show visuo-perceptual impairments. They also confirm the notion of a general facial and emotion recognition deficit in HFASD. No shared emotion recognition deficit was found for paranoid SZ and HFASD, emphasizing the differential cognitive underpinnings of both disorders. Copyright © 2014 Elsevier B.V. All rights reserved.
Autonomic imbalance is associated with reduced facial recognition in somatoform disorders.
Pollatos, Olga; Herbert, Beate M; Wankner, Sarah; Dietel, Anja; Wachsmuth, Cornelia; Henningsen, Peter; Sack, Martin
2011-10-01
Somatoform disorders are characterized by the presence of multiple somatic symptoms. While the accuracy of perceiving bodily signals (interoceptive awareness) has been only sparsely investigated in somatoform disorders, recent research has associated autonomic imbalance with cognitive and emotional difficulties in stress-related diseases. This study aimed to investigate how sympathovagal reactivity interacts with performance in recognizing emotions in faces (facial recognition task). Using a facial recognition and appraisal task, skin conductance levels (SCLs), heart rate (HR) and heart rate variability (HRV) were assessed in 26 somatoform patients and compared to healthy controls. Interoceptive awareness was assessed by a heartbeat detection task. We found evidence for a sympathovagal imbalance in somatoform disorders characterized by low parasympathetic reactivity during emotional tasks and increased sympathetic activation at baseline. Somatoform patients exhibited reduced recognition performance for neutral and sad emotional expressions only. Possible confounding variables such as alexithymia, anxiety or depression were taken into account. Interoceptive awareness was reduced in somatoform patients. Our data demonstrate an imbalance in sympathovagal activation in somatoform disorders associated with decreased parasympathetic activation. This might account for the difficulties in processing sad and neutral facial expressions in somatoform patients, which in turn might be a pathogenic mechanism for increased everyday vulnerability. Copyright © 2011 Elsevier Inc. All rights reserved.
NK1 receptor antagonism and emotional processing in healthy volunteers.
Chandra, P; Hafizi, S; Massey-Chase, R M; Goodwin, G M; Cowen, P J; Harmer, C J
2010-04-01
The neurokinin-1 (NK(1)) receptor antagonist, aprepitant, showed activity in several animal models of depression; however, its efficacy in clinical trials was disappointing. There is little knowledge of the role of NK(1) receptors in human emotional behaviour to help explain this discrepancy. The aim of the current study was to assess the effects of a single oral dose of aprepitant (125 mg) on models of emotional processing sensitive to conventional antidepressant drug administration in 38 healthy volunteers, randomly allocated to receive aprepitant or placebo in a between-groups, double-blind design. Performance on measures of facial expression recognition, emotional categorisation, memory and an attentional visual-probe task was assessed following drug absorption. Relative to placebo, aprepitant improved recognition of happy facial expressions and increased vigilance to emotional information in the unmasked condition of the visual-probe task. In contrast, aprepitant impaired emotional memory and slowed responses in the facial expression recognition task, suggesting possible deleterious effects on cognition. These results suggest that while antagonism of NK(1) receptors does affect emotional processing in humans, its effects are more restricted and less consistent across tasks than those of conventional antidepressants. Human models of emotional processing may provide a useful means of assessing the likely therapeutic potential of new treatments for depression.
Food-Induced Emotional Resonance Improves Emotion Recognition.
Pandolfi, Elisa; Sacripante, Riccardo; Cardini, Flavia
2016-01-01
The effect of food substances on emotional states has been widely investigated, showing, for example, that eating chocolate is able to reduce negative mood. Here, for the first time, we have shown that the consumption of specific food substances is not only able to induce particular emotional states, but more importantly, to facilitate recognition of corresponding emotional facial expressions in others. Participants were asked to perform an emotion recognition task before and after eating either a piece of chocolate or a small amount of fish sauce-which we expected to induce happiness or disgust, respectively. Our results showed that being in a specific emotional state improves recognition of the corresponding emotional facial expression. Indeed, eating chocolate improved recognition of happy faces, while disgusted expressions were more readily recognized after eating fish sauce. In line with the embodied account of emotion understanding, we suggest that people are better at inferring the emotional state of others when their own emotional state resonates with the observed one.
Food-Induced Emotional Resonance Improves Emotion Recognition
Pandolfi, Elisa; Sacripante, Riccardo; Cardini, Flavia
2016-01-01
The effect of food substances on emotional states has been widely investigated, showing, for example, that eating chocolate is able to reduce negative mood. Here, for the first time, we have shown that the consumption of specific food substances is not only able to induce particular emotional states, but more importantly, to facilitate recognition of corresponding emotional facial expressions in others. Participants were asked to perform an emotion recognition task before and after eating either a piece of chocolate or a small amount of fish sauce—which we expected to induce happiness or disgust, respectively. Our results showed that being in a specific emotional state improves recognition of the corresponding emotional facial expression. Indeed, eating chocolate improved recognition of happy faces, while disgusted expressions were more readily recognized after eating fish sauce. In line with the embodied account of emotion understanding, we suggest that people are better at inferring the emotional state of others when their own emotional state resonates with the observed one. PMID:27973559
Action and Emotion Recognition from Point Light Displays: An Investigation of Gender Differences
Alaerts, Kaat; Nackaerts, Evelien; Meyns, Pieter; Swinnen, Stephan P.; Wenderoth, Nicole
2011-01-01
Folk psychology advocates the existence of gender differences in socio-cognitive functions such as ‘reading’ the mental states of others or discerning subtle differences in body-language. A female advantage has been demonstrated for emotion recognition from facial expressions, but virtually nothing is known about gender differences in recognizing bodily stimuli or body language. The aim of the present study was to investigate potential gender differences in a series of tasks, involving the recognition of distinct features from point light displays (PLDs) depicting bodily movements of a male and female actor. Although recognition scores were considerably high at the overall group level, female participants were more accurate than males in recognizing the depicted actions from PLDs. Response times were significantly higher for males compared to females on PLD recognition tasks involving (i) the general recognition of ‘biological motion’ versus ‘non-biological’ (or ‘scrambled’ motion); or (ii) the recognition of the ‘emotional state’ of the PLD-figures. No gender differences were revealed for a control test (involving the identification of a color change in one of the dots) and for recognizing the gender of the PLD-figure. In addition, previous findings of a female advantage on a facial emotion recognition test (the ‘Reading the Mind in the Eyes Test’ (Baron-Cohen, 2001)) were replicated in this study. Interestingly, a strong correlation was revealed between emotion recognition from bodily PLDs versus facial cues. This relationship indicates that inter-individual or gender-dependent differences in recognizing emotions are relatively generalized across facial and bodily emotion perception. Moreover, the tight correlation between a subject's ability to discern subtle emotional cues from PLDs and the subject's ability to basically discriminate biological from non-biological motion provides indications that differences in emotion recognition may - at least to some degree – be related to more basic differences in processing biological motion per se. PMID:21695266
Action and emotion recognition from point light displays: an investigation of gender differences.
Alaerts, Kaat; Nackaerts, Evelien; Meyns, Pieter; Swinnen, Stephan P; Wenderoth, Nicole
2011-01-01
Folk psychology advocates the existence of gender differences in socio-cognitive functions such as 'reading' the mental states of others or discerning subtle differences in body-language. A female advantage has been demonstrated for emotion recognition from facial expressions, but virtually nothing is known about gender differences in recognizing bodily stimuli or body language. The aim of the present study was to investigate potential gender differences in a series of tasks, involving the recognition of distinct features from point light displays (PLDs) depicting bodily movements of a male and female actor. Although recognition scores were considerably high at the overall group level, female participants were more accurate than males in recognizing the depicted actions from PLDs. Response times were significantly higher for males compared to females on PLD recognition tasks involving (i) the general recognition of 'biological motion' versus 'non-biological' (or 'scrambled' motion); or (ii) the recognition of the 'emotional state' of the PLD-figures. No gender differences were revealed for a control test (involving the identification of a color change in one of the dots) and for recognizing the gender of the PLD-figure. In addition, previous findings of a female advantage on a facial emotion recognition test (the 'Reading the Mind in the Eyes Test' (Baron-Cohen, 2001)) were replicated in this study. Interestingly, a strong correlation was revealed between emotion recognition from bodily PLDs versus facial cues. This relationship indicates that inter-individual or gender-dependent differences in recognizing emotions are relatively generalized across facial and bodily emotion perception. Moreover, the tight correlation between a subject's ability to discern subtle emotional cues from PLDs and the subject's ability to basically discriminate biological from non-biological motion provides indications that differences in emotion recognition may - at least to some degree - be related to more basic differences in processing biological motion per se.
Reyes, B Nicole; Segal, Shira C; Moulson, Margaret C
2018-01-01
Emotion recognition is important for social interaction and communication, yet previous research has identified a cross-cultural emotion recognition deficit: Recognition is less accurate for emotions expressed by individuals from a cultural group different than one's own. The current study examined whether social categorization based on race, in the absence of cultural differences, influences emotion recognition in a diverse context. South Asian and White Canadians in the Greater Toronto Area completed an emotion recognition task that required them to identify the seven basic emotional expressions when posed by members of the same two groups, allowing us to tease apart the contributions of culture and social group membership. Contrary to our hypothesis, there was no mutual in-group advantage in emotion recognition: Participants were not more accurate at recognizing emotions posed by their respective racial in-groups. Both groups were more accurate at recognizing expressions when posed by South Asian faces, and White participants were more accurate overall compared to South Asian participants. These results suggest that in a diverse environment, categorization based on race alone does not lead to the creation of social out-groups in a way that negatively impacts emotion recognition.
An investigation of the effect of race-based social categorization on adults’ recognition of emotion
Reyes, B. Nicole; Segal, Shira C.; Moulson, Margaret C.
2018-01-01
Emotion recognition is important for social interaction and communication, yet previous research has identified a cross-cultural emotion recognition deficit: Recognition is less accurate for emotions expressed by individuals from a cultural group different than one’s own. The current study examined whether social categorization based on race, in the absence of cultural differences, influences emotion recognition in a diverse context. South Asian and White Canadians in the Greater Toronto Area completed an emotion recognition task that required them to identify the seven basic emotional expressions when posed by members of the same two groups, allowing us to tease apart the contributions of culture and social group membership. Contrary to our hypothesis, there was no mutual in-group advantage in emotion recognition: Participants were not more accurate at recognizing emotions posed by their respective racial in-groups. Both groups were more accurate at recognizing expressions when posed by South Asian faces, and White participants were more accurate overall compared to South Asian participants. These results suggest that in a diverse environment, categorization based on race alone does not lead to the creation of social out-groups in a way that negatively impacts emotion recognition. PMID:29474367
Emotion Recognition of Virtual Agents' Facial Expressions: The Effects of Age and Emotion Intensity
Beer, Jenay M.; Fisk, Arthur D.; Rogers, Wendy A.
2014-01-01
People make determinations about the social characteristics of an agent (e.g., robot or virtual agent) by interpreting social cues displayed by the agent, such as facial expressions. Although a considerable amount of research has been conducted investigating age-related differences in emotion recognition of human faces (e.g., Sullivan & Ruffman, 2004), the effect of age on emotion identification of virtual agent facial expressions has been largely unexplored. Age-related differences in emotion recognition of facial expressions are an important factor to consider in the design of agents that may assist older adults in a recreational or healthcare setting. The purpose of the current research was to investigate whether age-related differences in facial emotion recognition can extend to emotion-expressive virtual agents. Younger and older adults performed a recognition task with a virtual agent expressing six basic emotions. Larger age-related differences were expected for virtual agents displaying negative emotions, such as anger, sadness, and fear. In fact, the results indicated that older adults showed a decrease in emotion recognition accuracy for a virtual agent's emotions of anger, fear, and happiness. PMID:25552896
Haldane, Morgan; Jogia, Jigar; Cobb, Annabel; Kozuch, Eliza; Kumari, Veena; Frangou, Sophia
2008-01-01
Verbal working memory and emotional self-regulation are impaired in Bipolar Disorder (BD). Our aim was to investigate the effect of Lamotrigine (LTG), which is effective in the clinical management of BD, on the neural circuits subserving working memory and emotional processing. Functional Magnetic Resonance Imaging data from 12 stable BD patients were used to detect LTG-induced changes as the differences in brain activity between drug-free and post-LTG monotherapy conditions during a verbal working memory task (N-back sequential letter task) and an angry facial affect recognition task. For both tasks, LTG monotherapy compared to baseline was associated with increased activation mostly within the prefrontal cortex and cingulate gyrus, in regions normally engaged in verbal working memory and emotional processing. Therefore, LTG monotherapy in BD patients may enhance cortical function within neural circuits involved in memory and emotional self-regulation.
Rigon, Arianna; Turkstra, Lyn; Mutlu, Bilge; Duff, Melissa
2018-01-01
Although moderate to severe traumatic brain injury (TBI) leads to facial affect recognition impairments in up to 39% of individuals, protective and risk factors for these deficits are unknown. The aim of the current study was to examine the effect of sex on emotion recognition abilities following TBI. We administered two separate emotion recognition tests (one static and one dynamic) to 53 individuals with moderate to severe TBI (Females=28) and 49 demographically matched comparisons (Females=22). We then investigated the presence of a sex-by-group interaction in emotion recognition accuracy. In the comparison group, there were no sex differences. In the TBI group, however, females significantly outperformed males in the dynamic (but not the static) task. Moreover, males (but not females) with TBI performed significantly worse than comparison participants in the dynamic task. Further analysis revealed that sex differences in emotion recognition abilities within the TBI group could not be explained by lesion location, TBI severity, or other neuropsychological variables. These findings suggest that sex may serve as a protective factor for social impairment following TBI and inform clinicians working with TBI as well as research on the neurophysiological correlates of sex differences in social functioning. PMID:27245826
Ehrlé, Nathalie; Henry, Audrey; Pesa, Audrey; Bakchine, Serge
2011-03-01
This paper presents a French battery designed to assess emotional and sociocognitive abilities in neurological patients in clinical practice. The first part of this battery includes subtests assessing emotions: a recognition task of primary facial emotions, a discrimination task of facial emotions, a task of expressive intensity judgment, a task of gender identification, and a recognition task of musical emotions. The second part assesses sociocognitive abilities, mainly theory of mind (tasks requiring the attribution of mental states to others: first- and second-order false-belief tasks, a faux-pas task) and social norms (a moral/conventional distinction task, a social situations task), but also abstract language and humour. We present a general description of the battery, with special attention to the specific methodological constraints of assessing neurological patients. After a brief introduction to moral and conventional judgments (definition and current theoretical basis), the French version of the social norm task from RJR Blair (Blair and Cipolotti, 2000) is described in detail. The relevance of these tasks in the frontal variant of frontotemporal dementia (fvFTD) is illustrated by the results of a study conducted in 18 patients by the Cambridge group and by our own study of a patient with early-stage fvFTD. The relevance of diagnosing sociocognitive impairment in neurological patients is discussed.
Quantifying facial expression recognition across viewing conditions.
Goren, Deborah; Wilson, Hugh R
2006-04-01
Facial expressions are key to social interactions and to assessment of potential danger in various situations. Therefore, our brains must be able to recognize facial expressions when they are transformed in biologically plausible ways. We used synthetic happy, sad, angry and fearful faces to determine the amount of geometric change required to recognize these emotions during brief presentations. Five-alternative forced choice conditions involving central viewing, peripheral viewing and inversion were used to study recognition among the four emotions. Two-alternative forced choice was used to study affect discrimination when spatial frequency information in the stimulus was modified. The results show an emotion and task-dependent pattern of detection. Facial expressions presented with low peak frequencies are much harder to discriminate from neutral than faces defined by either mid or high peak frequencies. Peripheral presentation of faces also makes recognition much more difficult, except for happy faces. Differences between fearful detection and recognition tasks are probably due to common confusions with sadness when recognizing fear from among other emotions. These findings further support the idea that these emotions are processed separately from each other.
Facial recognition deficits as a potential endophenotype in bipolar disorder.
Vierck, Esther; Porter, Richard J; Joyce, Peter R
2015-11-30
Bipolar disorder (BD) is considered a highly heritable and genetically complex disorder. Several cognitive functions, such as executive functions and verbal memory, have been suggested as promising candidate endophenotypes. Although there is evidence for deficits in facial emotion recognition in individuals with BD, studies investigating these functions as endophenotypes are rare. The current study investigates emotion recognition as a potential endophenotype in BD by comparing 36 BD participants, 24 of their first-degree relatives and 40 healthy control participants on a computerised facial emotion recognition task. Group differences were evaluated using repeated-measures analysis of covariance with age as a covariate. Results revealed slowed emotion recognition in both BD participants and their relatives. Furthermore, BD participants were less accurate than healthy controls in their recognition of emotion expressions. We found no evidence of emotion-specific differences between groups. Our results provide evidence for facial emotion recognition as a potential endophenotype in BD. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
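The group comparison reported above relies on an analysis of covariance with age entered as a covariate. Purely as a hedged sketch, and not the authors' code, an age-adjusted group effect on recognition accuracy could be estimated as follows; the column names accuracy, group and age are hypothetical.

    # Illustrative ANCOVA-style sketch (not the published analysis): compare groups
    # (patients, relatives, controls) on recognition accuracy while adjusting for age.
    # Assumes hypothetical columns: accuracy, group, age.
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    def age_adjusted_group_effect(df: pd.DataFrame):
        model = smf.ols("accuracy ~ C(group) + age", data=df).fit()
        # Type II sums of squares report the group effect adjusted for the covariate.
        return sm.stats.anova_lm(model, typ=2)

The same formula interface extends to reaction-time outcomes, which is where the slowing reported for patients and relatives would show up.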
Emotion effects on implicit and explicit musical memory in normal aging.
Narme, Pauline; Peretz, Isabelle; Strub, Marie-Laure; Ergis, Anne-Marie
2016-12-01
Normal aging affects explicit memory while leaving implicit memory relatively spared. Normal aging also modifies how emotions are processed and experienced, with increasing evidence that older adults (OAs) focus more on positive information than younger adults (YAs). The aim of the present study was to investigate how age-related changes in emotion processing influence explicit and implicit memory. We used emotional melodies that differed in terms of valence (positive or negative) and arousal (high or low). Implicit memory was assessed with a preference task exploiting exposure effects, and explicit memory with a recognition task. Results indicated that effects of valence and arousal interacted to modulate both implicit and explicit memory in YAs. In OAs, recognition was poorer than in YAs; however, recognition of positive and high-arousal (happy) studied melodies was comparable. Insofar as socioemotional selectivity theory (SST) predicts a preservation of the recognition of positive information, our findings are not fully consistent with the extension of this theory to positive melodies since recognition of low-arousal (peaceful) studied melodies was poorer in OAs. In the preference task, YAs showed stronger exposure effects than OAs, suggesting an age-related decline of implicit memory. This impairment is smaller than the one observed for explicit memory (recognition), extending to the musical domain the dissociation between explicit memory decline and implicit memory relative preservation in aging. Finally, the disproportionate preference for positive material seen in OAs did not translate into stronger exposure effects for positive material suggesting no age-related emotional bias in implicit memory. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Intact anger recognition in depression despite aberrant visual facial information usage.
Clark, Cameron M; Chiu, Carina G; Diaz, Ruth L; Goghari, Vina M
2014-08-01
Previous literature has indicated abnormalities in facial emotion recognition abilities, as well as deficits in basic visual processes, in major depression. However, the literature is unclear on a number of important factors, including whether these abnormalities represent deficient or enhanced emotion recognition abilities compared to control populations, and the degree to which basic visual deficits might impact this process. The present study investigated emotion recognition abilities for angry versus neutral facial expressions in a sample of undergraduate students with Beck Depression Inventory-II (BDI-II) scores indicative of moderate depression (i.e., ≥20), compared to matched low-BDI-II score (i.e., ≤2) controls, via the Bubbles Facial Emotion Perception Task. Results indicated unimpaired behavioural performance in discriminating angry from neutral expressions in the high depressive symptoms group relative to the minimal depressive symptoms group, despite evidence of an abnormal pattern of visual facial information usage. The generalizability of the current findings is limited by the highly structured nature of the facial emotion recognition task used, as well as the use of an analog sample of undergraduates scoring high in self-rated symptoms of depression rather than a clinical sample. Our findings suggest that basic visual processes are involved in emotion recognition abnormalities in depression, demonstrating consistency with the emotion recognition literature in other psychopathologies (e.g., schizophrenia, autism, social anxiety). Future research should seek to replicate these findings in clinical populations with major depression, and assess the association between aberrant face gaze behaviours and symptom severity and social functioning. Copyright © 2014 Elsevier B.V. All rights reserved.
More Pronounced Deficits in Facial Emotion Recognition for Schizophrenia than Bipolar Disorder
Goghari, Vina M; Sponheim, Scott R
2012-01-01
Schizophrenia and bipolar disorder are typically separated in diagnostic systems. Behavioural, cognitive, and brain abnormalities associated with each disorder nonetheless overlap. We evaluated the diagnostic specificity of facial emotion recognition deficits in schizophrenia and bipolar disorder to determine whether select aspects of emotion recognition differed for the two disorders. The investigation used an experimental task that included the same facial images in an emotion recognition condition and an age recognition condition (to control for processes associated with general face recognition) in 27 schizophrenia patients, 16 bipolar I patients, and 30 controls. Schizophrenia and bipolar patients exhibited both shared and distinct aspects of facial emotion recognition deficits. Schizophrenia patients had deficits in recognizing angry facial expressions compared to healthy controls and bipolar patients. Compared to control participants, both schizophrenia and bipolar patients were more likely to mislabel facial expressions of anger as fear. Given that schizophrenia patients exhibited a deficit in emotion recognition for angry faces, which did not appear due to generalized perceptual and cognitive dysfunction, improving recognition of threat-related expression may be an important intervention target to improve social functioning in schizophrenia. PMID:23218816
Gender differences in the relationship between social communication and emotion recognition.
Kothari, Radha; Skuse, David; Wakefield, Justin; Micali, Nadia
2013-11-01
To investigate the association between autistic traits and emotion recognition in a large community sample of children using facial and social motion cues, additionally stratifying by gender. A general population sample of 3,666 children from the Avon Longitudinal Study of Parents and Children (ALSPAC) were assessed on their ability to correctly recognize emotions using the faces subtest of the Diagnostic Analysis of Non-Verbal Accuracy, and the Emotional Triangles Task, a novel test assessing recognition of emotion from social motion cues. Children with autistic-like social communication difficulties, as assessed by the Social Communication Disorders Checklist, were compared with children without such difficulties. Autistic-like social communication difficulties were associated with poorer recognition of emotion from social motion cues in both genders, but were associated with poorer facial emotion recognition in boys only (odds ratio = 1.9, 95% CI = 1.4, 2.6, p = .0001). This finding must be considered in light of lower power to detect differences in girls. In this community sample of children, greater deficits in social communication skills are associated with poorer discrimination of emotions, implying there may be an underlying continuum of liability to the association between these characteristics. As a similar degree of association was observed in both genders on a novel test of social motion cues, the relatively good performance of girls on the more familiar task of facial emotion discrimination may be due to compensatory mechanisms. Our study might indicate the existence of a cognitive process by which girls with underlying autistic traits can compensate for their covert deficits in emotion recognition, although this would require further investigation. Copyright © 2013 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.
Perspective taking in older age revisited: a motivational perspective.
Zhang, Xin; Fung, Helene H; Stanley, Jennifer T; Isaacowitz, Derek M; Ho, Man Yee
2013-10-01
How perspective-taking ability changes with age (i.e., whether older adults are better at understanding others' behaviors and intentions and show greater empathy to others or not) is not clear, with prior empirical findings on this phenomenon yielding mixed results. In a series of experiments, we investigated the phenomenon from a motivational perspective. Perceived closeness between participants and the experimenter (Study 1) or the target in an emotion recognition task (Study 2) was manipulated to examine whether the closeness could influence participants' performance in faux pas recognition (Study 1) and emotion recognition (Study 2). It was found that the well-documented negative age effect (i.e., older adults performed worse than younger adults in faux pas and emotion recognition tasks) was only replicated in the control condition for both tasks. When closeness was experimentally increased, older adults enhanced their performance, and they now performed at a comparable level as younger adults. Findings from the 2 experiments suggest that the reported poorer performance of older adults in perspective-taking tasks might be attributable to a lack of motivation instead of ability to perform in laboratory settings. With the presence of strong motivation, older adults have the ability to perform equally well as younger adults.
Wolf, Richard C; Pujara, Maia; Baskaya, Mustafa K; Koenigs, Michael
2016-09-01
Facial emotion recognition is a critical aspect of human communication. Since abnormalities in facial emotion recognition are associated with social and affective impairment in a variety of psychiatric and neurological conditions, identifying the neural substrates and psychological processes underlying facial emotion recognition will help advance basic and translational research on social-affective function. Ventromedial prefrontal cortex (vmPFC) has recently been implicated in deploying visual attention to the eyes of emotional faces, although there is mixed evidence regarding the importance of this brain region for recognition accuracy. In the present study of neurological patients with vmPFC damage, we used an emotion recognition task with morphed facial expressions of varying intensities to determine (1) whether vmPFC is essential for emotion recognition accuracy, and (2) whether instructed attention to the eyes of faces would be sufficient to improve any accuracy deficits. We found that vmPFC lesion patients are impaired, relative to neurologically healthy adults, at recognizing moderate intensity expressions of anger and that recognition accuracy can be improved by providing instructions of where to fixate. These results suggest that vmPFC may be important for the recognition of facial emotion through a role in guiding visual attention to emotionally salient regions of faces. Copyright © 2016 Elsevier Ltd. All rights reserved.
Attentional biases and memory for emotional stimuli in men and male rhesus monkeys.
Lacreuse, Agnès; Schatz, Kelly; Strazzullo, Sarah; King, Hanna M; Ready, Rebecca
2013-11-01
We examined attentional biases for social and non-social emotional stimuli in young adult men and compared the results to those of male rhesus monkeys (Macaca mulatta) previously tested in a similar dot-probe task (King et al. in Psychoneuroendocrinology 37(3):396-409, 2012). Recognition memory for these stimuli was also analyzed in each species, using a recognition memory task in humans and a delayed non-matching-to-sample task in monkeys. We found that both humans and monkeys displayed a similar pattern of attentional biases toward threatening facial expressions of conspecifics. The bias was significant in monkeys and of marginal significance in humans. In addition, humans, but not monkeys, exhibited an attentional bias away from negative non-social images. Attentional biases for social and non-social threat differed significantly, with both species showing a pattern of vigilance toward negative social images and avoidance of negative non-social images. Positive stimuli did not elicit significant attentional biases for either species. In humans, emotional content facilitated the recognition of non-social images, but no effect of emotion was found for the recognition of social images. Recognition accuracy was not affected by emotion in monkeys, but response times were faster for negative relative to positive images. Altogether, these results suggest shared mechanisms of social attention in humans and monkeys, with both species showing a pattern of selective attention toward threatening faces of conspecifics. These data are consistent with the view that selective vigilance to social threat is the result of evolutionary constraints. Yet, selective attention to threat was weaker in humans than in monkeys, suggesting that regulatory mechanisms enable non-anxious humans to reduce sensitivity to social threat in this paradigm, likely through enhanced prefrontal control and reduced amygdala activation. In addition, the findings emphasize important differences in attentional biases to social versus non-social threat in both species. Differences in the impact of emotional stimuli on recognition memory between monkeys and humans will require further study, as methodological differences in the recognition tasks may have affected the results.
Holding, Benjamin C; Laukka, Petri; Fischer, Håkan; Bänziger, Tanja; Axelsson, John; Sundelin, Tina
2017-11-01
Insufficient sleep has been associated with impaired recognition of facial emotions. However, previous studies have found inconsistent results, potentially stemming from the type of static picture task used. We therefore examined whether insufficient sleep was associated with decreased emotion recognition ability in two separate studies using a dynamic multimodal task. Study 1 used a cross-sectional design consisting of 291 participants with questionnaire measures assessing sleep duration and self-reported sleep quality for the previous night. Study 2 used an experimental design involving 181 participants where individuals were quasi-randomized into either a sleep-deprivation (N = 90) or a sleep-control (N = 91) condition. All participants from both studies were tested on the same forced-choice multimodal test of emotion recognition to assess the accuracy of emotion categorization. Sleep duration, self-reported sleep quality (study 1), and sleep deprivation (study 2) did not predict overall emotion recognition accuracy or speed. Similarly, the responses to each of the twelve emotions tested showed no evidence of impaired recognition ability, apart from one positive association suggesting that greater self-reported sleep quality could predict more accurate recognition of disgust (study 1). The studies presented here involve considerably larger samples than previous studies and the results support the null hypotheses. Therefore, we suggest that the ability to accurately categorize the emotions of others is not associated with short-term sleep duration or sleep quality and is resilient to acute periods of insufficient sleep. © Sleep Research Society 2017. Published by Oxford University Press on behalf of the Sleep Research Society. All rights reserved. For permissions, please e-mail journals.permissions@oup.com.
Facial Emotion Recognition and Expression in Parkinson's Disease: An Emotional Mirror Mechanism?
Ricciardi, Lucia; Visco-Comandini, Federica; Erro, Roberto; Morgante, Francesca; Bologna, Matteo; Fasano, Alfonso; Ricciardi, Diego; Edwards, Mark J; Kilner, James
2017-01-01
Parkinson's disease (PD) patients have impaired facial expressivity (hypomimia) and difficulties in interpreting the emotional facial expressions produced by others, especially for aversive emotions. We aimed to evaluate the ability to produce facial emotional expressions and to recognize facial emotional expressions produced by others in a group of PD patients and a group of healthy participants, in order to explore the relationship between these two abilities and any differences between the two groups. Twenty non-demented, non-depressed PD patients and twenty healthy participants (HC) matched for demographic characteristics were studied. The ability to recognize emotional facial expressions was assessed with the Ekman 60-faces test (Emotion recognition task). Participants were video-recorded while posing facial expressions of 6 primary emotions (happiness, sadness, surprise, disgust, fear and anger). The most expressive pictures for each emotion were derived from the videos. Ten healthy raters were asked to look at the pictures displayed on a computer screen in pseudo-random order and to identify the emotional label in a six-alternative forced-choice response format (Emotion expressivity task). Reaction time (RT) and accuracy of responses were recorded. At the end of each trial the participant was asked to rate his/her confidence in the perceived accuracy of the response. For emotion recognition, PD patients scored lower than HC on the Ekman total score (p<0.001) and on the single-emotion sub-scores for happiness, fear, anger, sadness (p<0.01) and surprise (p = 0.02). In the facial emotion expressivity task, PD and HC differed significantly in the total score (p = 0.05) and in the sub-scores for happiness, sadness and anger (all p<0.001). RT and the level of confidence showed significant differences between PD and HC for the same emotions. There was a significant positive correlation between facial emotion recognition and expressivity in both groups; the correlation was even stronger when ranking emotions from the best recognized to the worst (R = 0.75, p = 0.004). PD patients showed difficulties in recognizing emotional facial expressions produced by others and in posing facial emotional expressions compared to healthy subjects. The linear correlation between recognition and expression in both experimental groups suggests that the two mechanisms share a common system, which may deteriorate in patients with PD. These results open new clinical and rehabilitation perspectives.
Associations between facial emotion recognition and young adolescents’ behaviors in bullying
Gini, Gianluca; Altoè, Gianmarco
2017-01-01
This study investigated whether the different behaviors young adolescents may enact during bullying episodes were associated with their ability to recognize morphed facial expressions of the six basic emotions, expressed at high and low intensity. The sample included 117 middle-school students (45.3% girls; mean age = 12.4 years) who filled in a peer nomination questionnaire and individually performed a computerized emotion recognition task. Bayesian generalized mixed-effects models showed a complex picture, in which type and intensity of emotions, students’ behavior and gender interacted in explaining recognition accuracy. Results are discussed with a particular focus on negative emotions, suggesting a “neutral” nature of emotion recognition ability, which does not necessarily lead to moral behavior but can also be used for pursuing immoral goals. PMID:29131871
Thonse, Umesh; Behere, Rishikesh V; Praharaj, Samir Kumar; Sharma, Podila Sathya Venkata Narasimha
2018-06-01
Facial emotion recognition deficits have been consistently demonstrated in patients with severe mental disorders. Expressed emotion has been found to be an important predictor of relapse. However, the relationship between facial emotion recognition abilities and expressed emotions and its influence on socio-occupational functioning in schizophrenia versus bipolar disorder has not been studied. In this study we examined 91 patients with schizophrenia and 71 with bipolar disorder for psychopathology, socio-occupational functioning and emotion recognition abilities. Primary caregivers of 62 patients with schizophrenia and 49 with bipolar disorder were assessed on the Family Attitude Questionnaire to assess their expressed emotions. Patients with schizophrenia and bipolar disorder performed similarly on the emotion recognition task. Patients in the schizophrenia group experienced more critical comments and had poorer socio-occupational functioning compared to patients with bipolar disorder. Poorer socio-occupational functioning in patients with schizophrenia was significantly associated with greater dissatisfaction in their caregivers. In patients with bipolar disorder, poorer emotion recognition scores significantly correlated with poorer adaptive living skills and greater hostility and dissatisfaction in their caregivers. The findings of our study suggest that emotion recognition abilities in patients with bipolar disorder are associated with negative expressed emotions, leading to problems in adaptive living skills. Copyright © 2018 Elsevier B.V. All rights reserved.
Parents’ Emotion-Related Beliefs, Behaviors, and Skills Predict Children's Recognition of Emotion
Castro, Vanessa L.; Halberstadt, Amy G.; Lozada, Fantasy T.; Craig, Ashley B.
2015-01-01
Children who are able to recognize others’ emotions are successful in a variety of socioemotional domains, yet we know little about how school-aged children's abilities develop, particularly in the family context. We hypothesized that children develop emotion recognition skill as a function of parents’ own emotion-related beliefs, behaviors, and skills. We examined parents’ beliefs about the value of emotion and guidance of children's emotion, parents’ emotion labeling and teaching behaviors, and parents’ skill in recognizing children's emotions in relation to their school-aged children's emotion recognition skills. Sixty-nine parent-child dyads completed questionnaires, participated in dyadic laboratory tasks, and identified their own emotions and emotions felt by the other participant from videotaped segments. Regression analyses indicate that parents’ beliefs, behaviors, and skills together account for 37% of the variance in child emotion recognition ability, even after controlling for parent and child expressive clarity. The findings suggest the importance of the family milieu in the development of children's emotion recognition skill in middle childhood, and add to accumulating evidence suggesting important age-related shifts in the relation between parental emotion socialization and child emotional development. PMID:26005393
Sensory Contributions to Impaired Emotion Processing in Schizophrenia
Butler, Pamela D.; Abeles, Ilana Y.; Weiskopf, Nicole G.; Tambini, Arielle; Jalbrzikowski, Maria; Legatt, Michael E.; Zemon, Vance; Loughead, James; Gur, Ruben C.; Javitt, Daniel C.
2009-01-01
Both emotion and visual processing deficits are documented in schizophrenia, and preferential magnocellular visual pathway dysfunction has been reported in several studies. This study examined the contribution to emotion-processing deficits of magnocellular and parvocellular visual pathway function, based on stimulus properties and shape of contrast response functions. Experiment 1 examined the relationship between contrast sensitivity to magnocellular- and parvocellular-biased stimuli and emotion recognition using the Penn Emotion Recognition (ER-40) and Emotion Differentiation (EMODIFF) tests. Experiment 2 altered the contrast levels of the faces themselves to determine whether emotion detection curves would show a pattern characteristic of magnocellular neurons and whether patients would show a deficit in performance related to early sensory processing stages. Results for experiment 1 showed that patients had impaired emotion processing and a preferential magnocellular deficit on the contrast sensitivity task. Greater deficits in ER-40 and EMODIFF performance correlated with impaired contrast sensitivity to the magnocellular-biased condition, which remained significant for the EMODIFF task even when nonspecific correlations due to group were considered in a step-wise regression. Experiment 2 showed contrast response functions indicative of magnocellular processing for both groups, with patients showing impaired performance. Impaired emotion identification on this task was also correlated with magnocellular-biased visual sensory processing dysfunction. These results provide evidence for a contribution of impaired early-stage visual processing in emotion recognition deficits in schizophrenia and suggest that a bottom-up approach to remediation may be effective. PMID:19793797
Sensory contributions to impaired emotion processing in schizophrenia.
Butler, Pamela D; Abeles, Ilana Y; Weiskopf, Nicole G; Tambini, Arielle; Jalbrzikowski, Maria; Legatt, Michael E; Zemon, Vance; Loughead, James; Gur, Ruben C; Javitt, Daniel C
2009-11-01
Both emotion and visual processing deficits are documented in schizophrenia, and preferential magnocellular visual pathway dysfunction has been reported in several studies. This study examined the contribution to emotion-processing deficits of magnocellular and parvocellular visual pathway function, based on stimulus properties and shape of contrast response functions. Experiment 1 examined the relationship between contrast sensitivity to magnocellular- and parvocellular-biased stimuli and emotion recognition using the Penn Emotion Recognition (ER-40) and Emotion Differentiation (EMODIFF) tests. Experiment 2 altered the contrast levels of the faces themselves to determine whether emotion detection curves would show a pattern characteristic of magnocellular neurons and whether patients would show a deficit in performance related to early sensory processing stages. Results for experiment 1 showed that patients had impaired emotion processing and a preferential magnocellular deficit on the contrast sensitivity task. Greater deficits in ER-40 and EMODIFF performance correlated with impaired contrast sensitivity to the magnocellular-biased condition, which remained significant for the EMODIFF task even when nonspecific correlations due to group were considered in a step-wise regression. Experiment 2 showed contrast response functions indicative of magnocellular processing for both groups, with patients showing impaired performance. Impaired emotion identification on this task was also correlated with magnocellular-biased visual sensory processing dysfunction. These results provide evidence for a contribution of impaired early-stage visual processing in emotion recognition deficits in schizophrenia and suggest that a bottom-up approach to remediation may be effective.
Ernst, Monique; Luckenbaugh, David A; Moolchan, Eric T; Temple, Veronica A; Jenness, Jessica; Korelitz, Katherine E; London, Edythe D; Kimes, Alane S
2010-03-01
This 4-year longitudinal study examined whether performance on a decision-making task and an emotion-processing task predicted the initiation of tobacco, marijuana, or alcohol use among 77 adolescents. Of the participants, 64% met criteria for an externalizing behavioral disorder; 33% did not initiate substance use, 13% used one of the three substances under investigation, 18% used two, and 36% used all three. Initiation of substance use was associated with enhanced recognition of angry emotion, but not with risky decision-making. In conclusion, adolescents who initiate drug use show a vulnerability in the form of a bias towards negative emotion, but not towards decisions that involve risk.
Realmuto, Sabrina; Zummo, Leila; Cerami, Chiara; Agrò, Luigi; Dodich, Alessandra; Canessa, Nicola; Zizzo, Andrea; Fierro, Brigida; Daniele, Ornella
2015-06-01
Despite an extensive literature on cognitive impairments in focal and generalized epilepsy, only a small number of studies have specifically explored social cognition disorders in epilepsy syndromes. The aim of our study was to investigate social cognition abilities in patients with temporal lobe epilepsy (TLE) and in patients with idiopathic generalized epilepsy (IGE). Thirty-nine patients (21 patients with TLE and 18 patients with IGE) and 21 matched healthy controls (HCs) were recruited. All subjects underwent a basic neuropsychological battery plus two experimental tasks evaluating emotion recognition from facial expression (Ekman-60-Faces test, Ek-60F) and mental state attribution (Story-based Empathy Task, SET). In particular, the latter is a newly developed task that assesses the ability to infer others' intentions (i.e., intention attribution - IA) and emotions (i.e., emotion attribution - EA) compared with a control condition of physical causality (i.e., causal inferences - CI). Compared with HCs, patients with TLE showed significantly lower performances on both social cognition tasks. In particular, all SET subconditions as well as the recognition of negative emotions were significantly impaired in patients with TLE vs. HCs. On the contrary, patients with IGE showed impairments on anger recognition only, without any deficit on the SET task. Emotion recognition deficits occur in patients with epilepsy, possibly because of a global disruption of a pathway involving frontal, temporal, and limbic regions. Impairments of mental state attribution specifically characterize the neuropsychological profile of patients with TLE, in the context of the temporal dysfunction typical of such patients. Impairments of socioemotional processing should be considered as part of the neuropsychological assessment in both TLE and IGE, with a view to correct management and future therapeutic interventions.
Benito, Adolfo; Lahera, Guillermo; Herrera, Sara; Muncharaz, Ramón; Benito, Guillermo; Fernández-Liria, Alberto; Montes, José Manuel
2013-01-01
To analyze the recognition, identification, and discrimination of facial emotions in a sample of outpatients with bipolar disorder (BD). Forty-four outpatients with a diagnosis of BD and 48 matched control subjects were selected. Both groups were assessed with tests for recognition (Emotion Recognition-40 - ER40), identification (Facial Emotion Identification Test - FEIT), and discrimination (Facial Emotion Discrimination Test - FEDT) of facial emotions, as well as a theory of mind (ToM) verbal test (Hinting Task). Differences between groups were analyzed, controlling for the influence of mild depressive and manic symptoms. Patients with BD scored significantly lower than controls on recognition (ER40), identification (FEIT), and discrimination (FEDT) of emotions. Regarding the verbal measure of ToM, a lower score was also observed in patients compared to controls. Patients with mild syndromal depressive symptoms obtained outcomes similar to patients in euthymia. A significant correlation between FEDT scores and global functioning (measured by the Functioning Assessment Short Test, FAST) was found. These results suggest that, even in euthymia, patients with BD experience deficits in recognition, identification, and discrimination of facial emotions, with potential functional implications.
Luebbe, Aaron M; Fussner, Lauren M; Kiel, Elizabeth J; Early, Martha C; Bell, Debora J
2013-12-01
Depressive symptomatology is associated with impaired recognition of emotion. Previous investigations have predominantly focused on emotion recognition of static facial expressions, neglecting the influence of social interaction and critical contextual factors. In the current study, we investigated how youth and maternal symptoms of depression may be associated with emotion recognition biases during familial interactions across distinct contextual settings. Further, we explored whether an individual's current emotional state may account for youth and maternal emotion recognition biases. Mother-adolescent dyads (N = 128) completed measures of depressive symptomatology and participated in three family interactions, each designed to elicit distinct emotions. Mothers and youth completed state affect ratings pertaining to self and other at the conclusion of each interaction task. In multiple regression analyses, depressive symptoms in both mothers and adolescents were associated with biased recognition of both positive affect (i.e., happy, excited) and negative affect (i.e., sadness, anger, frustration); however, this bias emerged primarily in contexts with a weaker emotional signal. Actor-partner interdependence models suggested that youth's own state affect accounted for depression-related biases in their recognition of maternal affect. State affect did not function similarly in explaining depression-related biases for maternal recognition of adolescent emotion. Together these findings suggest a similar negative bias in emotion recognition associated with depressive symptoms in both adolescents and mothers in real-life situations, albeit potentially driven by different mechanisms.
Hargreaves, A; Mothersill, O; Anderson, M; Lawless, S; Corvin, A; Donohoe, G
2016-10-28
Deficits in facial emotion recognition have been associated with functional impairments in patients with Schizophrenia (SZ). Whilst a strong ecological argument has been made for the use of both dynamic facial expressions and varied emotion intensities in research, SZ emotion recognition studies to date have primarily used static stimuli displaying a single (100%) emotion intensity. To address this issue, the present study aimed to investigate accuracy of emotion recognition amongst patients with SZ and healthy subjects using dynamic facial emotion stimuli of varying intensities. To this end, an emotion recognition task (ERT) designed by Montagne (2007) was adapted and employed. Forty-seven patients with a DSM-IV diagnosis of SZ and 51 healthy participants were assessed for emotion recognition. Results of the ERT were tested for correlation with performance in areas of cognitive ability typically found to be impaired in psychosis, including IQ, memory, attention and social cognition. Patients were found to perform less well than healthy participants at recognising each of the 6 emotions analysed. Surprisingly, however, groups did not differ in terms of impact of emotion intensity on recognition accuracy; for both groups higher intensity levels predicted greater accuracy, but no significant interaction between diagnosis and emotional intensity was found for any of the 6 emotions. Accuracy of emotion recognition was, however, more strongly correlated with cognition in the patient cohort. Whilst this study demonstrates the feasibility of using ecologically valid dynamic stimuli in the study of emotion recognition accuracy, varying the intensity of the emotion displayed was not demonstrated to impact patients and healthy participants differentially, and thus may not be a necessary variable to include in emotion recognition research.
Van Rheenen, Tamsyn E; Joshua, Nicole; Castle, David J; Rossell, Susan L
2017-03-01
Emotion recognition impairments have been demonstrated in schizophrenia (Sz), but are less consistent and lesser in magnitude in bipolar disorder (BD). This may be related to the extent to which different face processing strategies are engaged during emotion recognition in each of these disorders. We recently showed that Sz patients had impairments in the use of both featural and configural face processing strategies, whereas BD patients were impaired only in the use of the latter. Here we examine the influence that these impairments have on facial emotion recognition in these cohorts. Twenty-eight individuals with Sz, 28 individuals with BD, and 28 healthy controls completed a facial emotion labeling task with two conditions designed to separate the use of featural and configural face processing strategies; part-based and whole-face emotion recognition. Sz patients performed worse than controls on both conditions, and worse than BD patients on the whole-face condition. BD patients performed worse than controls on the whole-face condition only. Configural processing deficits appear to influence the recognition of facial emotions in BD, whereas both configural and featural processing abnormalities impair emotion recognition in Sz. This may explain discrepancies in the profiles of emotion recognition between the disorders. (JINS, 2017, 23, 287-291).
Biologically inspired emotion recognition from speech
NASA Astrophysics Data System (ADS)
Caponetti, Laura; Buscicchio, Cosimo Alessandro; Castellano, Giovanna
2011-12-01
Emotion recognition has become a fundamental task in human-computer interaction systems. In this article, we propose an emotion recognition approach based on biologically inspired methods. Specifically, emotion classification is performed using a long short-term memory (LSTM) recurrent neural network which is able to recognize long-range dependencies between successive temporal patterns. We propose to represent data using features derived from two different models: mel-frequency cepstral coefficients (MFCC) and the Lyon cochlear model. In the experimental phase, results obtained from the LSTM network and the two different feature sets are compared, showing that features derived from the Lyon cochlear model give better recognition results in comparison with those obtained with the traditional MFCC representation.
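A minimal sketch of the general pipeline this abstract describes, assuming an MFCC front end feeding an LSTM classifier over variable-length utterances; the emotion classes, layer sizes, and file handling are illustrative assumptions, the Lyon cochlear model front end is omitted, and this is not the authors' implementation.

```python
# Illustrative MFCC + LSTM speech-emotion pipeline (assumed, not the paper's code).
import librosa
import numpy as np
import tensorflow as tf

EMOTIONS = ["anger", "happiness", "sadness", "fear", "neutral"]  # assumed label set

def mfcc_sequence(path: str, n_mfcc: int = 13) -> np.ndarray:
    """Load an audio file and return its MFCC frames as a (time, n_mfcc) array."""
    y, sr = librosa.load(path, sr=None)
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T

def build_model(n_mfcc: int = 13) -> tf.keras.Model:
    """A single-layer LSTM classifier over variable-length MFCC sequences."""
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(None, n_mfcc)),     # variable-length input
        tf.keras.layers.Masking(mask_value=0.0),  # ignore zero padding
        tf.keras.layers.LSTM(64),                 # summarise the frame sequence
        tf.keras.layers.Dense(len(EMOTIONS), activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Training would pad the MFCC sequences to a common length and call
# model.fit(padded_features, integer_labels, ...).
```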
Facial and prosodic emotion recognition in social anxiety disorder.
Tseng, Huai-Hsuan; Huang, Yu-Lien; Chen, Jian-Ting; Liang, Kuei-Yu; Lin, Chao-Cheng; Chen, Sue-Huei
2017-07-01
Patients with social anxiety disorder (SAD) have a cognitive preference to negatively evaluate emotional information. In particular, the preferential biases in prosodic emotion recognition in SAD have been much less explored. The present study aims to investigate whether SAD patients retain negative evaluation biases across visual and auditory modalities when given sufficient response time to recognise emotions. Thirty-one SAD patients and 31 age- and gender-matched healthy participants completed a culturally suitable non-verbal emotion recognition task and received clinical assessments for social anxiety and depressive symptoms. A repeated measures analysis of variance was conducted to examine group differences in emotion recognition. Compared to healthy participants, SAD patients were significantly less accurate at recognising facial and prosodic emotions, and spent more time on emotion recognition. The differences were mainly driven by the lower accuracy and longer reaction times for recognising fearful emotions in SAD patients. Within the SAD patients, lower accuracy of sad face recognition was associated with higher severity of depressive and social anxiety symptoms, particularly with avoidance symptoms. These findings may represent a cross-modality pattern of avoidance in the later stage of identifying negative emotions in SAD. This pattern may be linked to clinical symptom severity.
Rupp, Claudia I; Derntl, Birgit; Osthaus, Friederike; Kemmler, Georg; Fleischhacker, W Wolfgang
2017-12-01
Despite growing evidence for neurobehavioral deficits in social cognition in alcohol use disorder (AUD), the clinical relevance remains unclear, and little is known about its impact on treatment outcome. This study prospectively investigated the impact of neurocognitive social abilities at treatment onset on treatment completion. Fifty-nine alcohol-dependent patients were assessed with measures of social cognition covering three core components of empathy, assessed via paradigms measuring: (i) emotion recognition (the ability to recognize emotions via facial expression), (ii) emotional perspective taking, and (iii) affective responsiveness at the beginning of inpatient treatment for alcohol dependence. Subjective measures were also obtained, including estimates of task performance and a self-report measure of empathic abilities (Interpersonal Reactivity Index). According to treatment outcomes, patients were divided into a patient group with a regular treatment course (e.g., with planned discharge and without relapse during treatment) or an irregular treatment course (e.g., relapse and/or premature and unplanned termination of treatment, "dropout"). Compared with patients completing treatment in a regular fashion, patients with relapse and/or dropout of treatment had significantly poorer facial emotion recognition ability at treatment onset. Additional logistic regression analyses confirmed these results and identified poor emotion recognition performance as a significant predictor of relapse/dropout. Self-report (subjective) measures did not correspond with the neurobehavioral social cognition measures, i.e., objective task performance. Analyses of individual subtypes of facial emotions revealed poorer recognition, particularly of disgust, anger, and neutral (no emotion) faces, in patients with relapse/dropout. Social cognition in AUD is clinically relevant. Less successful treatment outcome was associated with poorer facial emotion recognition ability at the beginning of treatment. Impaired facial emotion recognition represents a neurocognitive risk factor that should be taken into account in alcohol dependence treatment. Treatments targeting the improvement of these social cognition deficits in AUD may offer a promising future approach.
Mohan, S N; Mukhtar, F; Jobson, L
2016-01-01
Introduction Depression is a mood disorder that affects a significant proportion of the population worldwide. In Malaysia and Australia, the number of people diagnosed with depression is on the rise. It has been found that impairments in emotion processing and emotion regulation play a role in the development and maintenance of depression. This study is based on Matsumoto and Hwang's biocultural model of emotion and Triandis' Subjective Culture model. It aims to investigate the influence of culture on emotion processing among Malaysians and Australians with and without major depressive disorder (MDD). Methods and analysis This study will adopt a between-group design. Participants will include Malaysian Malays and Caucasian Australians with and without MDD (N=320). There will be four tasks involved in this study, namely: (1) the facial emotion recognition task, (2) the biological motion task, (3) the subjective experience task and (4) the emotion meaning task. It is hypothesised that there will be cultural differences in how participants with and without MDD respond to these emotion tasks and that, pan-culturally, MDD will influence accuracy rates in the facial emotion recognition task and the biological motion task. Ethics and dissemination This study is approved by the Universiti Putra Malaysia Research Ethics Committee (JKEUPM) and the Monash University Human Research Ethics Committee (MUHREC). Permission to conduct the study has also been obtained from the National Medical Research Register (NMRR; NMRR-15-2314-26919). On completion of the study, data will be kept by Universiti Putra Malaysia for a specific period of time before they are destroyed. Data will be published in a collective manner in the form of journal articles with no reference to a specific individual. PMID:27798019
Multisensory emotion perception in congenitally, early, and late deaf CI users
Nava, Elena; Villwock, Agnes K.; Büchner, Andreas; Lenarz, Thomas; Röder, Brigitte
2017-01-01
Emotions are commonly recognized by combining auditory and visual signals (i.e., vocal and facial expressions). Yet it is unknown whether the ability to link emotional signals across modalities depends on early experience with audio-visual stimuli. In the present study, we investigated the role of auditory experience at different stages of development for auditory, visual, and multisensory emotion recognition abilities in three groups of adolescent and adult cochlear implant (CI) users. CI users had a different deafness onset and were compared to three groups of age- and gender-matched hearing control participants. We hypothesized that congenitally deaf (CD) but not early deaf (ED) and late deaf (LD) CI users would show reduced multisensory interactions and a higher visual dominance in emotion perception than their hearing controls. The CD (n = 7), ED (deafness onset: <3 years of age; n = 7), and LD (deafness onset: >3 years; n = 13) CI users and the control participants performed an emotion recognition task with auditory, visual, and audio-visual emotionally congruent and incongruent nonsense speech stimuli. In different blocks, participants judged either the vocal (Voice task) or the facial expressions (Face task). In the Voice task, all three CI groups performed overall less efficiently than their respective controls and experienced higher interference from incongruent facial information. Furthermore, the ED CI users benefitted more than their controls from congruent faces and the CD CI users showed an analogous trend. In the Face task, recognition efficiency of the CI users and controls did not differ. Our results suggest that CI users acquire multisensory interactions to some degree, even after congenital deafness. When judging affective prosody they appear impaired and more strongly biased by concurrent facial information than typically hearing individuals. We speculate that limitations inherent to the CI contribute to these group differences. PMID:29023525
Multisensory emotion perception in congenitally, early, and late deaf CI users.
Fengler, Ineke; Nava, Elena; Villwock, Agnes K; Büchner, Andreas; Lenarz, Thomas; Röder, Brigitte
2017-01-01
Emotions are commonly recognized by combining auditory and visual signals (i.e., vocal and facial expressions). Yet it is unknown whether the ability to link emotional signals across modalities depends on early experience with audio-visual stimuli. In the present study, we investigated the role of auditory experience at different stages of development for auditory, visual, and multisensory emotion recognition abilities in three groups of adolescent and adult cochlear implant (CI) users. CI users had a different deafness onset and were compared to three groups of age- and gender-matched hearing control participants. We hypothesized that congenitally deaf (CD) but not early deaf (ED) and late deaf (LD) CI users would show reduced multisensory interactions and a higher visual dominance in emotion perception than their hearing controls. The CD (n = 7), ED (deafness onset: <3 years of age; n = 7), and LD (deafness onset: >3 years; n = 13) CI users and the control participants performed an emotion recognition task with auditory, visual, and audio-visual emotionally congruent and incongruent nonsense speech stimuli. In different blocks, participants judged either the vocal (Voice task) or the facial expressions (Face task). In the Voice task, all three CI groups performed overall less efficiently than their respective controls and experienced higher interference from incongruent facial information. Furthermore, the ED CI users benefitted more than their controls from congruent faces and the CD CI users showed an analogous trend. In the Face task, recognition efficiency of the CI users and controls did not differ. Our results suggest that CI users acquire multisensory interactions to some degree, even after congenital deafness. When judging affective prosody they appear impaired and more strongly biased by concurrent facial information than typically hearing individuals. We speculate that limitations inherent to the CI contribute to these group differences.
Oxytocin improves emotion recognition for older males.
Campbell, Anna; Ruffman, Ted; Murray, Janice E; Glue, Paul
2014-10-01
Older adults (≥60 years) perform worse than young adults (18-30 years) when recognizing facial expressions of emotion. The hypothesized cause of these changes might be declines in neurotransmitters that could affect information processing within the brain. In the present study, we examined the neuropeptide oxytocin that functions to increase neurotransmission. Research suggests that oxytocin benefits the emotion recognition of less socially able individuals. Men tend to have lower levels of oxytocin and older men tend to have worse emotion recognition than older women; therefore, there is reason to think that older men will be particularly likely to benefit from oxytocin. We examined this idea using a double-blind design, testing 68 older and 68 young adults randomly allocated to receive oxytocin nasal spray (20 international units) or placebo. Forty-five minutes afterward they completed an emotion recognition task assessing labeling accuracy for angry, disgusted, fearful, happy, neutral, and sad faces. Older males receiving oxytocin showed improved emotion recognition relative to those taking placebo. No differences were found for older females or young adults. We hypothesize that oxytocin facilitates emotion recognition by improving neurotransmission in the group with the worst emotion recognition. Copyright © 2014 Elsevier Inc. All rights reserved.
Ferrucci, Roberta; Giannicola, Gaia; Rosa, Manuela; Fumagalli, Manuela; Boggio, Paulo Sergio; Hallett, Mark; Zago, Stefano; Priori, Alberto
2012-01-01
Some evidence suggests that the cerebellum participates in the complex network processing emotional facial expression. To evaluate the role of the cerebellum in recognising facial expressions we delivered transcranial direct current stimulation (tDCS) over the cerebellum and prefrontal cortex. A facial emotion recognition task was administered to 21 healthy subjects before and after cerebellar tDCS; we also tested subjects with a visual attention task and a visual analogue scale (VAS) for mood. Anodal and cathodal cerebellar tDCS both significantly enhanced sensory processing in response to negative facial expressions (anodal tDCS, p=.0021; cathodal tDCS, p=.018), but left positive emotion and neutral facial expressions unchanged (p>.05). tDCS over the right prefrontal cortex left facial expressions of both negative and positive emotion unchanged. These findings suggest that the cerebellum is specifically involved in processing facial expressions of negative emotion.
Morosan, Larisa; Badoud, Deborah; Zaharia, Alexandra; Brosch, Tobias; Eliez, Stephan; Bateman, Anthony; Heller, Patrick; Debbané, Martin
2017-01-01
Background Previous research suggests that antisocial individuals show impairments in social cognitive processing, more specifically in emotion recognition (ER) and perspective taking (PT). The first aim of the present study was to investigate the recognition of a wide range of emotional expressions and visual PT capacities in a group of incarcerated male adolescents in comparison to a matched group of community adolescents. Secondly, we sought to explore the relationship between these two mechanisms in relation to psychopathic traits. Methods Forty-five male adolescents (22 incarcerated adolescents (Mage = 16.52, SD = 0.96) and 23 community adolescents (Mage = 16.43, SD = 1.41)) participated in the study. ER abilities were measured using a dynamic and multimodal task that requires the participants to watch short videos in which trained actors express 14 emotions. PT capacities were examined using a task previously shown to be sensitive to adolescent development, in which participants had to follow the directions of another person whilst taking that person's perspective into consideration. Results We found a main effect of group on emotion recognition scores. In comparison to the community adolescents, the incarcerated adolescents presented lower recognition of three emotions: interest, anxiety and amusement. Analyses also revealed significant impairments in PT capacities in incarcerated adolescents. In addition, incarcerated adolescents' PT scores were uniquely correlated with their scores on recognition of interest. Conclusions The results corroborate previously reported impairments in ER and PT capacities in the incarcerated adolescents. The study also indicates an association between impairments in the recognition of interest and impairments in PT. PMID:28122048
Sfärlea, Anca; Greimel, Ellen; Platt, Belinda; Bartling, Jürgen; Schulte-Körne, Gerd; Dieler, Alica C
2016-09-01
The present study explored the neurophysiological correlates of perception and recognition of emotional facial expressions in adolescent anorexia nervosa (AN) patients using event-related potentials (ERPs). We included 20 adolescent girls with AN and 24 healthy girls and recorded ERPs during a passive viewing task and three active tasks requiring processing of emotional faces in varying processing depths; one of the tasks also assessed emotion recognition abilities behaviourally. Despite the absence of behavioural differences, we found that across all tasks AN patients exhibited a less pronounced early posterior negativity (EPN) in response to all facial expressions compared to controls. The EPN is an ERP component reflecting an automatic, perceptual processing stage which is modulated by the intrinsic salience of a stimulus. Hence, the less pronounced EPN in anorexic girls suggests that they might perceive other people's faces as less intrinsically relevant, i.e. as less "important" than do healthy girls. Copyright © 2016 Elsevier B.V. All rights reserved.
Processing of Facial Emotion in Bipolar Depression and Euthymia.
Robinson, Lucy J; Gray, John M; Burt, Mike; Ferrier, I Nicol; Gallagher, Peter
2015-10-01
Previous studies of facial emotion processing in bipolar disorder (BD) have reported conflicting findings. In independently conducted studies, we investigate facial emotion labeling in euthymic and depressed BD patients using tasks with static and dynamically morphed images of different emotions displayed at different intensities. Study 1 included 38 euthymic BD patients and 28 controls. Participants completed two tasks: labeling of static images of basic facial emotions (anger, disgust, fear, happy, sad) shown at different expression intensities; the Eyes Test (Baron-Cohen, Wheelwright, Hill, Raste, & Plumb, 2001), which involves recognition of complex emotions using only the eye region of the face. Study 2 included 53 depressed BD patients and 47 controls. Participants completed two tasks: labeling of "dynamic" facial expressions of the same five basic emotions; the Emotional Hexagon test (Young, Perret, Calder, Sprengelmeyer, & Ekman, 2002). There were no significant group differences on any measures of emotion perception/labeling, compared to controls. A significant group by intensity interaction was observed in both emotion labeling tasks (euthymia and depression), although this effect did not survive the addition of measures of executive function/psychomotor speed as covariates. Only 2.6-15.8% of euthymic patients and 7.8-13.7% of depressed patients scored below the 10th percentile of the controls for total emotion recognition accuracy. There was no evidence of specific deficits in facial emotion labeling in euthymic or depressed BD patients. Methodological variations, including mood state, sample size, and the cognitive demands of the tasks, may contribute significantly to the variability in findings between studies.
Norton, Daniel; McBain, Ryan; Holt, Daphne J; Ongur, Dost; Chen, Yue
2009-06-15
Impaired emotion recognition has been reported in schizophrenia, yet the nature of this impairment is not completely understood. Recognition of facial emotion depends on processing affective and nonaffective facial signals, as well as basic visual attributes. We examined whether and how poor facial emotion recognition in schizophrenia is related to basic visual processing and nonaffective face recognition. Schizophrenia patients (n = 32) and healthy control subjects (n = 29) performed emotion discrimination, identity discrimination, and visual contrast detection tasks, where the emotionality, distinctiveness of identity, or visual contrast was systematically manipulated. Subjects determined which of two presentations in a trial contained the target: the emotional face for emotion discrimination, a specific individual for identity discrimination, and a sinusoidal grating for contrast detection. Patients had significantly higher thresholds (worse performance) than control subjects for discriminating both fearful and happy faces. Furthermore, patients' poor performance in fear discrimination was predicted by performance in visual detection and face identity discrimination. Schizophrenia patients require greater emotional signal strength to discriminate fearful or happy face images from neutral ones. Deficient emotion recognition in schizophrenia does not appear to be determined solely by affective processing but is also linked to the processing of basic visual and facial information.
Face Recognition, Musical Appraisal, and Emotional Crossmodal Bias.
Invitto, Sara; Calcagnì, Antonio; Mignozzi, Arianna; Scardino, Rosanna; Piraino, Giulia; Turchi, Daniele; De Feudis, Irio; Brunetti, Antonio; Bevilacqua, Vitoantonio; de Tommaso, Marina
2017-01-01
Recent research on the crossmodal integration of visual and auditory perception suggests that evaluations of emotional information in one sensory modality may tend toward the emotional value generated in another sensory modality. This implies that the emotions elicited by musical stimuli can influence the perception of emotional stimuli presented in other sensory modalities, through a top-down process. The aim of this work was to investigate how crossmodal perceptual processing influences emotional face recognition and how potential modulation of this processing induced by music could be influenced by the subject's musical competence. We investigated how emotional face recognition processing could be modulated by listening to music and how this modulation varies according to the subjective emotional salience of the music and the listener's musical competence. The sample consisted of 24 participants: 12 professional musicians and 12 university students (non-musicians). Participants performed an emotional go/no-go task whilst listening to music by Albeniz, Chopin, or Mozart. The target stimuli were emotionally neutral facial expressions. We examined the N170 Event-Related Potential (ERP) and behavioral responses (i.e., motor reaction time to target recognition and musical emotional judgment). A linear mixed-effects model and a decision-tree learning technique were applied to N170 amplitudes and latencies. The main findings of the study were that musicians' behavioral responses and N170 components were more affected by the emotional value of the music administered during the emotional go/no-go task, and that this bias was also apparent in responses to the non-target emotional face. This suggests that emotional information, coming from multiple sensory channels, activates a crossmodal integration process that depends upon the emotional salience of the stimuli and the listener's appraisal.
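The two analysis techniques named here (a linear mixed-effects model and a decision-tree learner applied to N170 measures) could be set up roughly as in the following sketch; the column names and model specification are assumptions for illustration, not the authors' analysis code.

```python
# A minimal sketch, under assumed column names, of a mixed-effects model on
# N170 amplitude with a random intercept per participant, plus a decision-tree
# learner over the same predictors. Illustrative only.
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.tree import DecisionTreeRegressor

def fit_models(erp: pd.DataFrame):
    """erp: one row per trial, with columns n170_amplitude, group
    (musician/non-musician), music_emotion_rating, and subject."""
    mixed = smf.mixedlm(
        "n170_amplitude ~ group * music_emotion_rating",
        data=erp,
        groups=erp["subject"],        # random intercept per participant
    ).fit()

    tree = DecisionTreeRegressor(max_depth=3).fit(
        pd.get_dummies(erp[["group", "music_emotion_rating"]], drop_first=True),
        erp["n170_amplitude"],
    )
    return mixed, tree

# mixed.summary() lists the fixed effects; tree.feature_importances_ gives a
# rough ranking of which predictors split N170 amplitude most strongly.
```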
Face Age and Eye Gaze Influence Older Adults' Emotion Recognition.
Campbell, Anna; Murray, Janice E; Atkinson, Lianne; Ruffman, Ted
2017-07-01
Eye gaze has been shown to influence emotion recognition. In addition, older adults (over 65 years) are not as influenced by gaze direction cues as young adults (18-30 years). Nevertheless, these differences might stem from the use of young to middle-aged faces in emotion recognition research because older adults have an attention bias toward old-age faces. Therefore, using older face stimuli might allow older adults to process gaze direction cues to influence emotion recognition. To investigate this idea, young and older adults completed an emotion recognition task with young and older face stimuli displaying direct and averted gaze, assessing labeling accuracy for angry, disgusted, fearful, happy, and sad faces. Direct gaze rather than averted gaze improved young adults' recognition of emotions in young and older faces, but for older adults this was true only for older faces. The current study highlights the impact of stimulus face age and gaze direction on emotion recognition in young and older adults. The use of young face stimuli with direct gaze in most research might contribute to age-related emotion recognition differences.
Attention and memory bias to facial emotions underlying negative symptoms of schizophrenia.
Jang, Seon-Kyeong; Park, Seon-Cheol; Lee, Seung-Hwan; Cho, Yang Seok; Choi, Kee-Hong
2016-01-01
This study assessed bias in selective attention to facial emotions in negative symptoms of schizophrenia and its influence on subsequent memory for facial emotions. Thirty people with schizophrenia who had high and low levels of negative symptoms (n = 15, respectively) and 21 healthy controls completed a visual probe detection task investigating selective attention bias (happy, sad, and angry faces randomly presented for 50, 500, or 1000 ms). A yes/no incidental facial memory task was then completed. Attention bias scores and recognition errors were calculated. Those with high negative symptoms exhibited reduced attention to emotional faces relative to neutral faces; those with low negative symptoms showed the opposite pattern when faces were presented for 500 ms regardless of the valence. Compared to healthy controls, those with high negative symptoms made more errors for happy faces in the memory task. Reduced attention to emotional faces in the probe detection task was significantly associated with less pleasure and motivation and more recognition errors for happy faces in schizophrenia group only. Attention bias away from emotional information relatively early in the attentional process and associated diminished positive memory may relate to pathological mechanisms for negative symptoms.
Sawyer, Alyssa C P; Williamson, Paul; Young, Robyn L
2012-04-01
Research has shown that individuals with Autism Spectrum Disorders (ASD) have difficulties recognising emotions from facial expressions. Since eye contact is important for accurate emotion recognition, and individuals with ASD tend to avoid eye contact, this tendency for gaze aversion has been proposed as an explanation for the emotion recognition deficit. This explanation was investigated using a newly developed emotion and mental state recognition task. Individuals with Asperger's Syndrome were less accurate at recognising emotions and mental states, but did not show evidence of gaze avoidance compared to individuals without Asperger's Syndrome. This suggests that the way individuals with Asperger's Syndrome look at faces cannot account for the difficulty they have recognising expressions.
Face Processing and Facial Emotion Recognition in Adults with Down Syndrome
ERIC Educational Resources Information Center
Barisnikov, Koviljka; Hippolyte, Loyse; Van der Linden, Martial
2008-01-01
Face processing and facial expression recognition was investigated in 17 adults with Down syndrome, and results were compared with those of a child control group matched for receptive vocabulary. On the tasks involving faces without emotional content, the adults with Down syndrome performed significantly worse than did the controls. However, their…
Communication Skills Training Exploiting Multimodal Emotion Recognition
ERIC Educational Resources Information Center
Bahreini, Kiavash; Nadolski, Rob; Westera, Wim
2017-01-01
The teaching of communication skills is a labour-intensive task because of the detailed feedback that should be given to learners during their prolonged practice. This study investigates to what extent our FILTWAM facial and vocal emotion recognition software can be used for improving a serious game (the Communication Advisor) that delivers a…
Laurent, Agathe; Arzimanoglou, Alexis; Panagiotakaki, Eleni; Sfaello, Ignacio; Kahane, Philippe; Ryvlin, Philippe; Hirsch, Edouard; de Schonen, Scania
2014-12-01
A high rate of abnormal social behavioural traits or perceptual deficits is observed in children with unilateral temporal lobe epilepsy. In the present study, perception of auditory and visual social signals, carried by faces and voices, was evaluated in children or adolescents with temporal lobe epilepsy. We prospectively investigated a sample of 62 children with focal non-idiopathic epilepsy early in the course of the disorder. The present analysis included 39 children with a confirmed diagnosis of temporal lobe epilepsy. Seventy-two control participants, distributed across 10 age groups, served as the comparison group. Our socio-perceptual evaluation protocol comprised three socio-visual tasks (face identity, facial emotion and gaze direction recognition), two socio-auditory tasks (voice identity and emotional prosody recognition), and three control tasks (lip reading, geometrical pattern and linguistic intonation recognition). All 39 patients also underwent a neuropsychological examination. As a group, children with temporal lobe epilepsy performed at a significantly lower level compared to the control group with regard to recognition of facial identity, direction of eye gaze, and emotional facial expressions. We found no relationship between the type of visual deficit and age at first seizure, duration of epilepsy, or the epilepsy-affected cerebral hemisphere. Deficits in socio-perceptual tasks could be found independently of the presence of deficits in visual or auditory episodic memory, visual non-facial pattern processing (control tasks), or speech perception. A normal FSIQ did not exempt some of the patients from an underlying deficit in some of the socio-perceptual tasks. Temporal lobe epilepsy not only impairs development of emotion recognition, but can also impair development of perception of other socio-perceptual signals in children with or without intellectual deficiency. Prospective studies need to be designed to evaluate the results of appropriate re-education programs in children presenting with deficits in social cue processing.
Impairment in the recognition of emotion across different media following traumatic brain injury.
Williams, Claire; Wood, Rodger Ll
2010-02-01
The current study examined emotion recognition following traumatic brain injury (TBI) and investigated whether performance differed according to the affective valence and type of media presentation of the stimuli. A total of 64 patients with TBI and matched controls completed the Emotion Evaluation Test (EET) and Ekman 60 Faces Test (E-60-FT). Patients with TBI also completed measures of information processing and verbal ability. Results revealed that the TBI group were significantly impaired compared to controls when recognizing emotion on the EET and E-60-FT. A significant main effect of valence was found in both groups, with poorer recognition of negative emotions. However, the difference between the recognition of positive and negative emotions was larger in the TBI group. The TBI group were also more accurate at recognizing emotion displayed in audiovisual media (EET) than that displayed in still media (E-60-FT). No significant relationship was obtained between emotion recognition tasks and information-processing speed. A significant positive relationship was found between the E-60-FT and one measure of verbal ability. These findings support models of emotion that specify separate neurological pathways for certain emotions and different media and confirm that patients with TBI are vulnerable to experiencing emotion recognition difficulties.
Campanella, Fabio; Shallice, Tim; Ius, Tamara; Fabbro, Franco; Skrap, Miran
2014-09-01
Patients affected by brain tumours may show behavioural and emotional regulation deficits, sometimes showing flattened affect and sometimes experiencing a true 'change' in personality. However, little evidence is available to the surgeon as to what changes are likely to occur with damage at specific sites, as previous studies have either relied on single cases or provided only limited anatomical specificity, mostly reporting associations rather than dissociations of symptoms. We investigated these aspects in patients undergoing surgery for the removal of cerebral tumours. We argued that many of the problems described can be ascribed to the onset of difficulties in one or more of the different levels of the process of mentalizing (i.e. abstracting and reflecting upon) emotion and intentions, which impacts on everyday behaviour. These were investigated in terms of (i) emotion recognition; (ii) Theory of Mind; (iii) alexithymia; and (iv) self-maturity (personality disorder). We hypothesized that temporo/limbic areas would be critical for processing emotion and intentions at a more perceptual level, while frontal lobe structures would be more critical when higher levels of mentalization/abstraction are required. We administered four different tasks, Task 1: emotion recognition of Ekman faces; Task 2: the Eyes Test (Theory of Mind); Task 3: Toronto Alexithymia Scale; and Task 4: Temperament and Character Inventory (a personality inventory), both immediately before and a few days after the operation for the removal of brain tumours in a series of 71 patients (age range: 18-75 years; 33 female) with lesions located in the left or right frontal, temporal and parietal lobes. Lobe-based and voxel-based analysis confirmed that tasks requiring interpretation of emotions and intentions at more basic (less mentalized) levels (Tasks 1 and 2) were more affected by temporo/insular lesions, with emotion recognition (Task 1) being maximally impaired by anterior temporal and amygdala lesions and Task 2 (found to be a 'basic' Theory of Mind task involving only limited mentalization) being mostly impaired by posterior temporoparietal lesions. Tasks relying on higher-level mentalization (Tasks 3 and 4) were maximally affected by prefrontal lesions, with the alexithymia scale (Task 3) being mostly associated with anterior/medial lesions and the self-maturity measure (Task 4) with lateral prefrontal ones.
Age differences in right-wing authoritarianism and their relation to emotion recognition.
Ruffman, Ted; Wilson, Marc; Henry, Julie D; Dawson, Abigail; Chen, Yan; Kladnitski, Natalie; Myftari, Ella; Murray, Janice; Halberstadt, Jamin; Hunter, John A
2016-03-01
This study examined the correlates of right-wing authoritarianism (RWA) in older adults. Participants were given tasks measuring emotion recognition, executive functions and fluid IQ and questionnaires measuring RWA, perceived threat and social dominance orientation. Study 1 established that RWA was higher at older ages across the age span in more than 2,600 New Zealanders. Studies 2 to 4 found that threat, education, social dominance and age all predicted unique variance in older adults' RWA, but the most consistent predictor was emotion recognition, predicting unique variance in older adults' RWA independent of all other variables. We argue that older adults' worse emotion recognition is associated with a more general change in social judgment. Expression of extreme attitudes (right- or left-wing) has the potential to antagonize others, but worse emotion recognition means that subtle signals will not be perceived, making the expression of extreme attitudes more likely. Our findings are consistent with other studies showing that worsening emotion recognition underlies age-related declines in verbosity, understanding of social gaffes, and ability to detect lies. Such results indicate that emotion recognition is a core social insight linked to many aspects of social cognition.
Facial Emotion Recognition and Expression in Parkinson’s Disease: An Emotional Mirror Mechanism?
Ricciardi, Lucia; Visco-Comandini, Federica; Erro, Roberto; Morgante, Francesca; Bologna, Matteo; Fasano, Alfonso; Ricciardi, Diego; Edwards, Mark J.; Kilner, James
2017-01-01
Background and aim Parkinson’s disease (PD) patients have impairment of facial expressivity (hypomimia) and difficulties in interpreting the emotional facial expressions produced by others, especially for aversive emotions. We aimed to evaluate the ability to produce facial emotional expressions and to recognize facial emotional expressions produced by others in a group of PD patients and a group of healthy participants in order to explore the relationship between these two abilities and any differences between the two groups of participants. Methods Twenty non-demented, non-depressed PD patients and twenty healthy participants (HC) matched for demographic characteristics were studied. The ability to recognize emotional facial expressions was assessed with the Ekman 60-faces test (Emotion recognition task). Participants were video-recorded while posing facial expressions of 6 primary emotions (happiness, sadness, surprise, disgust, fear and anger). The most expressive pictures for each emotion were derived from the videos. Ten healthy raters were asked to look at the pictures displayed on a computer screen in pseudo-random order and to identify the emotional label in a six-forced-choice response format (Emotion expressivity task). Reaction time (RT) and accuracy of responses were recorded. At the end of each trial the participant was asked to rate his/her confidence in his/her perceived accuracy of response. Results For emotion recognition, PD patients scored lower than HC on the Ekman total score (p<0.001) and on the single-emotion sub-scores for happiness, fear, anger, and sadness (p<0.01) and surprise (p = 0.02). In the facial emotion expressivity task, PD and HC significantly differed in the total score (p = 0.05) and in the sub-scores for happiness, sadness, anger (all p<0.001). RT and the level of confidence showed significant differences between PD and HC for the same emotions. There was a significant positive correlation between facial emotion recognition and expressivity in both groups; the correlation was even stronger when ranking emotions from the best recognized to the worst (R = 0.75, p = 0.004). Conclusions PD patients showed difficulties in recognizing emotional facial expressions produced by others and in posing facial emotional expressions compared to healthy subjects. The linear correlation between recognition and expression in both experimental groups suggests that the two mechanisms share a common system, which may be degraded in patients with PD. These results open new clinical and rehabilitation perspectives. PMID:28068393
Positive and negative emotion enhances the processing of famous faces in a semantic judgment task.
Bate, Sarah; Haslam, Catherine; Hodgson, Timothy L; Jansari, Ashok; Gregory, Nicola; Kay, Janice
2010-01-01
Previous work has consistently reported a facilitatory influence of positive emotion in face recognition (e.g., D'Argembeau, Van der Linden, Comblain, & Etienne, 2003). However, these reports asked participants to make recognition judgments in response to faces, and it is unknown whether emotional valence may influence other stages of processing, such as at the level of semantics. Furthermore, other evidence suggests that negative rather than positive emotion facilitates higher level judgments when processing nonfacial stimuli (e.g., Mickley & Kensinger, 2008), and it is possible that negative emotion also influences latter stages of face processing. The present study addressed this issue, examining the influence of emotional valence while participants made semantic judgments in response to a set of famous faces. Eye movements were monitored while participants performed this task, and analyses revealed a reduction in information extraction for the faces of liked and disliked celebrities compared with those of emotionally neutral celebrities. Thus, in contrast to work using familiarity judgments, both positive and negative emotion facilitated processing in this semantic-based task. This pattern of findings is discussed in relation to current models of face processing. Copyright 2009 APA, all rights reserved.
Chiu, Isabelle; Gfrörer, Regina I; Piguet, Olivier; Berres, Manfred; Monsch, Andreas U; Sollberger, Marc
2015-08-01
The importance of including measures of emotion processing, such as tests of facial emotion recognition (FER), as part of a comprehensive neuropsychological assessment is being increasingly recognized. In clinical settings, FER tests need to be sensitive, short, and easy to administer, given the limited time available and patient limitations. Current tests, however, commonly use stimuli that either display prototypical emotions, bearing the risk of ceiling effects and unequal task difficulty, or are cognitively too demanding and time-consuming. To overcome these limitations in FER testing in patient populations, we aimed to define FER threshold levels for the six basic emotions in healthy individuals. Forty-nine healthy individuals between 52 and 79 years of age were asked to identify the six basic emotions at different intensity levels (25%, 50%, 75%, 100%, and 125% of the prototypical emotion). Analyses uncovered differing threshold levels across emotions and sex of facial stimuli, ranging from 50% up to 100% intensities. Using these findings as "healthy population benchmarks", we propose to apply these threshold levels to clinical populations either as facial emotion recognition or intensity rating tasks. As part of any comprehensive social cognition test battery, this approach should allow for a rapid and sensitive assessment of potential FER deficits.
Limbrecht-Ecklundt, Kerstin; Scheck, Andreas; Jerg-Bretzke, Lucia; Walter, Steffen; Hoffmann, Holger; Traue, Harald C.
2013-01-01
Objective: This article examines potential methodological problems in applying a forced-choice response format to facial emotion recognition. Methodology: Thirty-three subjects were presented with validated facial stimuli. The task was to make a decision about which emotion was shown. In addition, the subjective certainty concerning the decision was recorded. Results: The detection rates are 68% for fear, 81% for sadness, 85% for anger, 87% for surprise, 88% for disgust, and 94% for happiness, and are thus well above chance level. Conclusion: This study refutes the concern that the use of forced choice formats may not adequately reflect actual recognition performance. The use of standardized tests to examine emotion recognition ability leads to valid results and can be used in different contexts. For example, the images presented here appear suitable for diagnosing deficits in emotion recognition in the context of psychological disorders and for mapping treatment progress. PMID:23798981
A facial expression of pax: Assessing children's "recognition" of emotion from faces.
Nelson, Nicole L; Russell, James A
2016-01-01
In a classic study, children were shown an array of facial expressions and asked to choose the person who expressed a specific emotion. Children were later asked to name the emotion in the face with any label they wanted. Subsequent research often relied on the same two tasks (choice from array and free labeling) to support the conclusion that children recognize basic emotions from facial expressions. Here five studies (N=120, 2- to 10-year-olds) showed that these two tasks produce illusory recognition when a novel nonsense facial expression is included in the array. Children "recognized" a nonsense emotion (pax or tolen) and two familiar emotions (fear and jealousy) from the same nonsense face. Children likely used a process of elimination; they paired the unknown facial expression with a label given in the choice-from-array task and, after just two trials, freely labeled the new facial expression with the new label. These data indicate that past studies using this method may have overestimated children's expression knowledge.
Impaired recognition of scary music following unilateral temporal lobe excision.
Gosselin, Nathalie; Peretz, Isabelle; Noulhiane, Marion; Hasboun, Dominique; Beckett, Christine; Baulac, Michel; Samson, Séverine
2005-03-01
Music constitutes an ideal means to create a sense of suspense in films. However, there has been minimal investigation into the underlying cerebral organization for perceiving danger created by music. In comparison, the amygdala's role in recognition of fear in non-musical contexts has been well established. The present study sought to fill this gap in exploring how patients with amygdala resection recognize emotional expression in music. To this aim, we tested 16 patients with left (LTR; n = 8) or right (RTR; n = 8) medial temporal resection (including amygdala) for the relief of medically intractable seizures and 16 matched controls in an emotion recognition task involving instrumental music. The musical selections were purposely created to induce fear, peacefulness, happiness and sadness. Participants were asked to rate to what extent each musical passage expressed these four emotions on 10-point scales. In order to check for the presence of a perceptual problem, the same musical selections were presented to the participants in an error detection task. None of the patients was found to perform below controls in the perceptual task. In contrast, both LTR and RTR patients were found to be impaired in the recognition of scary music. Recognition of happy and sad music was normal. These findings suggest that the anteromedial temporal lobe (including the amygdala) plays a role in the recognition of danger in a musical context.
Gamond, L; Cattaneo, Z
2016-12-01
Consistent evidence suggests that emotional facial expressions are better recognized when the expresser and the perceiver belong to the same social group (in-group advantage). In this study, we used transcranial magnetic stimulation (TMS) to investigate the possible causal involvement of the dorsomedial prefrontal cortex (dmPFC) and of the right temporo-parietal junction (TPJ), two main nodes of the mentalizing neural network, in mediating the in-group advantage in emotion recognition. Participants performed an emotion discrimination task in a minimal (blue/green) group paradigm. We found that disrupting activity in the dmPFC significantly weakened the effect of minimal group membership on emotion recognition, reducing participants' ability to discriminate emotions expressed by in-group members. In turn, TMS over the rTPJ mainly affected emotion discrimination per se, irrespective of group membership. Overall, our results point to a causal role of the dmPFC in mediating the in-group advantage in emotion recognition, favoring intragroup communication. Copyright © 2016 Elsevier Ltd. All rights reserved.
Recognition of face identity and emotion in expressive specific language impairment.
Merkenschlager, A; Amorosa, H; Kiefl, H; Martinius, J
2012-01-01
The aim was to study face and emotion recognition in children with mostly expressive specific language impairment (SLI-E). A test movie assessing perception and recognition of faces and mimic-gestural expression was administered to 24 children diagnosed with SLI-E and to an age-matched control group of normally developing children. The SLI-E children scored significantly worse than controls on both the face and expression recognition tasks, with a preponderant effect on emotion recognition. The performance of the SLI-E group could not be explained by reduced attention during the test session. We conclude that SLI-E is associated with a deficiency in decoding non-verbal emotional facial and gestural information, which might lead to profound and persistent problems in social interaction and development. Copyright © 2012 S. Karger AG, Basel.
Barbato, Mariapaola; Liu, Lu; Cadenhead, Kristin S; Cannon, Tyrone D; Cornblatt, Barbara A; McGlashan, Thomas H; Perkins, Diana O; Seidman, Larry J; Tsuang, Ming T; Walker, Elaine F; Woods, Scott W; Bearden, Carrie E; Mathalon, Daniel H; Heinssen, Robert; Addington, Jean
2015-09-01
Social cognition, the mental operations that underlie social interactions, is a major construct to investigate in schizophrenia. Impairments in social cognition are present before the onset of psychosis, and even in unaffected first-degree relatives, suggesting that social cognition may be a trait marker of the illness. In a large cohort of individuals at clinical high risk for psychosis (CHR) and healthy controls, three domains of social cognition (theory of mind, facial emotion recognition and social perception) were assessed to clarify which domains are impaired in this population. Six hundred and seventy-five CHR individuals and 264 controls, who were part of the multi-site North American Prodromal Longitudinal Study, completed The Awareness of Social Inference Test, the Penn Emotion Recognition task, the Penn Emotion Differentiation task, and the Relationship Across Domains, measures of theory of mind, facial emotion recognition, and social perception, respectively. Social cognition was not related to positive and negative symptom severity, but was associated with age and IQ. CHR individuals demonstrated poorer performance on all measures of social cognition. However, after controlling for age and IQ, the group differences remained significant for measures of theory of mind and social perception, but not for facial emotion recognition. Theory of mind and social perception are impaired in individuals at CHR for psychosis. Age and IQ seem to play an important role in the emergence of deficits in facial affect recognition. Future studies should examine the stability of social cognition deficits over time and their role, if any, in the development of psychosis.
Domes, Gregor; Kumbier, Ekkehardt; Heinrichs, Markus; Herpertz, Sabine C
2014-01-01
The neuropeptide oxytocin has recently been shown to enhance eye gaze and emotion recognition in healthy men. Here, we report a randomized double-blind, placebo-controlled trial that examined the neural and behavioral effects of a single dose of intranasal oxytocin on emotion recognition in individuals with Asperger syndrome (AS), a clinical condition characterized by impaired eye gaze and facial emotion recognition. Using functional magnetic resonance imaging, we examined whether oxytocin would enhance emotion recognition from facial sections of the eye vs the mouth region and modulate regional activity in brain areas associated with face perception in both adults with AS, and a neurotypical control group. Intranasal administration of the neuropeptide oxytocin improved performance in a facial emotion recognition task in individuals with AS. This was linked to increased left amygdala reactivity in response to facial stimuli and increased activity in the neural network involved in social cognition. Our data suggest that the amygdala, together with functionally associated cortical areas mediate the positive effect of oxytocin on social cognitive functioning in AS. PMID:24067301
Williams, Beth T; Gray, Kylie M; Tonge, Bruce J
2012-12-01
Children with autism have difficulties in emotion recognition and a number of interventions have been designed to target these problems. However, few emotion training interventions have been trialled with young children with autism and co-morbid ID. This study aimed to evaluate the efficacy of an emotion training programme for a group of young children with autism with a range of intellectual ability. Participants were 55 children with autistic disorder, aged 4-7 years (FSIQ 42-107). Children were randomly assigned to an intervention (n = 28) or control group (n = 27). Participants in the intervention group watched a DVD designed to teach emotion recognition skills to children with autism (the Transporters), whereas the control group watched a DVD of Thomas the Tank Engine. Participants were assessed on their ability to complete basic emotion recognition tasks, mindreading and theory of mind (TOM) tasks before and after the 4-week intervention period, and at 3-month follow-up. Analyses controlled for the effect of chronological age, verbal intelligence, gender and DVD viewing time on outcomes. Children in the intervention group showed improved performance in the recognition of anger compared with the control group, with few improvements maintained at 3-month follow-up. There was no generalisation of skills to TOM or social skills. The Transporters programme showed limited efficacy in teaching basic emotion recognition skills to young children with autism with a lower range of cognitive ability. Improvements were limited to the recognition of expressions of anger, with poor maintenance of these skills at follow-up. These findings provide limited support for the efficacy of the Transporters programme for young children with autism of a lower cognitive range. © 2012 The Authors. Journal of Child Psychology and Psychiatry © 2012 Association for Child and Adolescent Mental Health.
Ipser, Jonathan C; Terburg, David; Syal, Supriya; Phillips, Nicole; Solms, Mark; Panksepp, Jaak; Malcolm-Smith, Susan; Thomas, Kevin; Stein, Dan J; van Honk, Jack
2013-01-01
In rodents, the endogenous opioid system has been implicated in emotion regulation, and in the reduction of fear in particular. In humans, while there is evidence that the opioid antagonist naloxone acutely enhances the acquisition of conditioned fear, there are no corresponding data on the effect of opioid agonists in moderating responses to fear. We investigated whether a single 0.2 mg administration of the mu-opioid agonist buprenorphine would decrease fear sensitivity with an emotion-recognition paradigm. Healthy human subjects participated in a randomized placebo-controlled within-subject design, in which they performed a dynamic emotion recognition task 120 min after administration of buprenorphine and placebo. In the recognition task, basic emotional expressions were morphed between their full expression and neutral in 2% steps, and presented as dynamic video clips with final frames of different emotional intensity for each trial, which allows for a fine-grained measurement of emotion sensitivity. Additionally, visual analog scales were used to investigate acute effects of buprenorphine on mood. Compared to placebo, buprenorphine resulted in a significant reduction in the sensitivity for recognizing fearful facial expressions exclusively. Our data demonstrate, for the first time in humans, that acute up-regulation of the opioid system reduces fear recognition sensitivity. Moreover, the absence of an effect of buprenorphine on mood provides evidence of a direct influence of opioids upon the core fear system in the human brain. Copyright © 2012 Elsevier Ltd. All rights reserved.
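The abstract does not specify how recognition sensitivity was scored, so the following Python sketch only illustrates one plausible way to summarize per-emotion sensitivity from such a morphed-intensity task; the file name and column names (subject, drug, emotion, intensity, correct) are assumptions for illustration, not details taken from the study.

```python
import pandas as pd

# Hypothetical trial-level data: one row per video-clip judgement.
# Assumed columns: subject, drug ('buprenorphine' or 'placebo'), emotion,
# intensity (final-frame morph level in 2% steps), correct (0/1).
trials = pd.read_csv("emotion_recognition_trials.csv")

# Summarize recognition sensitivity as mean accuracy per subject, drug and emotion.
sensitivity = (
    trials.groupby(["subject", "drug", "emotion"])["correct"]
          .mean()
          .rename("accuracy")
          .reset_index()
)

# A fear-specific drug effect would appear as lower accuracy for 'fear' under
# buprenorphine than under placebo, with the other emotions largely unchanged.
print(sensitivity.groupby(["drug", "emotion"])["accuracy"].mean().unstack("drug"))
```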
Social Behavior and Impairments in Social Cognition Following Traumatic Brain Injury.
May, Michelle; Milders, Maarten; Downey, Bruce; Whyte, Maggie; Higgins, Vanessa; Wojcik, Zuzana; Amin, Sophie; O'Rourke, Suzanne
2017-05-01
The negative effects of changes in social behavior following traumatic brain injury (TBI) are known, but much less is known about the neuropsychological impairments that may underlie and predict these changes. The current study investigated possible associations between post-injury behavior and the neuropsychological competencies of emotion recognition, understanding intentions, and response selection, which have been proposed as important for social functioning. Forty participants with TBI and 32 matched healthy participants completed a battery of tests assessing the three functions of interest. In addition, self- and proxy reports of pre- and post-injury behavior, mood, and community integration were collected. The TBI group performed significantly more poorly than the comparison group on all tasks of emotion recognition, understanding intention, and on one task of response selection. Ratings of current behavior suggested significant changes in the TBI group relative to before the injury and showed significantly poorer community integration and interpersonal behavior than the comparison group. Of the three functions considered, emotion recognition was associated with both post-injury behavior and community integration and this association could not be fully explained by injury severity, time since injury, or education. The current study confirmed earlier findings of associations between emotion recognition and post-TBI behavior, providing partial evidence for models proposing emotion recognition as one of the pre-requisites for adequate social functioning. (JINS, 2017, 23, 400-411).
Golan, Ofer; Ashwin, Emma; Granader, Yael; McClintock, Suzy; Day, Kate; Leggett, Victoria; Baron-Cohen, Simon
2010-03-01
This study evaluated The Transporters, an animated series designed to enhance emotion comprehension in children with autism spectrum conditions (ASC). n = 20 children with ASC (aged 4-7) watched The Transporters every day for 4 weeks. Participants were tested before and after intervention on emotional vocabulary and emotion recognition at three levels of generalization. Two matched control groups of children (ASC group, n = 18 and typically developing group, n = 18) were also assessed twice without any intervention. The intervention group improved significantly more than the clinical control group on all task levels, performing comparably to typical controls at Time 2. We conclude that using The Transporters significantly improves emotion recognition in children with ASC. Future research should evaluate the series' effectiveness with lower-functioning individuals.
ERIC Educational Resources Information Center
Janning, Ruth; Schatten, Carlotta; Schmidt-Thieme, Lars
2016-01-01
Recognising students' emotion, affect or cognition is a relatively young field and still a challenging task in the area of intelligent tutoring systems. There are several ways to use the output of these recognition tasks within the system. The approach most often mentioned in the literature is using it for giving feedback to the students. The…
Theory of mind and recognition of facial emotion in dementia: challenge to current concepts.
Freedman, Morris; Binns, Malcolm A; Black, Sandra E; Murphy, Cara; Stuss, Donald T
2013-01-01
Current literature suggests that theory of mind (ToM) and recognition of facial emotion are impaired in behavioral variant frontotemporal dementia (bvFTD). In contrast, studies suggest that ToM is spared in Alzheimer disease (AD). However, there is controversy whether recognition of emotion in faces is impaired in AD. This study challenges the concepts that ToM is preserved in AD and that recognition of facial emotion is impaired in bvFTD. ToM, recognition of facial emotion, and identification of emotions associated with video vignettes were studied in bvFTD, AD, and normal controls. ToM was assessed using false-belief and visual perspective-taking tasks. Identification of facial emotion was tested using Ekman and Friesen's pictures of facial affect. After adjusting for relevant covariates, there were significant ToM deficits in bvFTD and AD compared with controls, whereas neither group was impaired in the identification of emotions associated with video vignettes. There was borderline impairment in recognizing angry faces in bvFTD. Patients with AD showed significant deficits on false belief and visual perspective taking, and bvFTD patients were impaired on second-order false belief. We report novel findings challenging the concepts that ToM is spared in AD and that recognition of facial emotion is impaired in bvFTD.
Tryptophan depletion decreases the recognition of fear in female volunteers.
Harmer, C J; Rogers, R D; Tunbridge, E; Cowen, P J; Goodwin, G M
2003-06-01
Serotonergic processes have been implicated in the modulation of fear conditioning in humans, postulated to occur at the level of the amygdala. The processing of other fear-relevant cues, such as facial expressions, has also been associated with amygdala function, but an effect of serotonin depletion on these processes has not been assessed. The present study investigated the effects of reducing serotonin function, using acute tryptophan depletion, on the recognition of basic facial expressions of emotions in healthy male and female volunteers. A double-blind between-groups design was used, with volunteers being randomly allocated to receive an amino acid drink specifically lacking tryptophan or a control drink containing a balanced mixture of amino acids. Participants were given a facial expression recognition task 5 h after drink administration. This task featured examples of six basic emotions (fear, anger, disgust, surprise, sadness and happiness) that had been morphed between each full emotion and neutral in 10% steps. As a control, volunteers were given a famous face classification task matched in terms of response selection and difficulty level. Tryptophan depletion significantly impaired the recognition of fearful facial expressions in female, but not male, volunteers. This was specific since recognition of other basic emotions was comparable in the two groups. There was also no effect of tryptophan depletion on the classification of famous faces or on subjective state ratings of mood or anxiety. These results confirm a role for serotonin in the processing of fear-related cues, and in line with previous findings also suggest greater effects of tryptophan depletion in female volunteers. Although acute tryptophan depletion does not typically affect mood in healthy subjects, the present results suggest that subtle changes in the processing of emotional material may occur with this manipulation of serotonin function.
Sleep in Children Enhances Preferentially Emotional Declarative But Not Procedural Memories
ERIC Educational Resources Information Center
Prehn-Kristensen, Alexander; Goder, Robert; Chirobeja, Stefania; Bressman, Inka; Ferstl, Roman; Baving, Lioba
2009-01-01
Although the consolidation of several memory systems is enhanced by sleep in adults, recent studies suggest that sleep supports declarative memory but not procedural memory in children. In the current study, the influence of sleep on emotional declarative memory (recognition task) and procedural memory (mirror tracing task) in 20 healthy children…
Rohr, Michaela; Tröger, Johannes; Michely, Nils; Uhde, Alarith; Wentura, Dirk
2017-07-01
This article deals with two well-documented phenomena regarding emotional stimuli: emotional memory enhancement (better long-term memory for emotional than for neutral stimuli) and the emotion-induced recognition bias (a more liberal response criterion for emotional than for neutral stimuli). Studies on visual emotion perception and attention suggest that emotion-related processes can be modulated by means of spatial-frequency filtering of the presented emotional stimuli. Specifically, low spatial frequencies are assumed to play a primary role for the influence of emotion on attention and judgment. Given this theoretical background, we investigated whether spatial-frequency filtering also impacts (1) the memory advantage for emotional faces and (2) the emotion-induced recognition bias, in a series of old/new recognition experiments. Participants completed incidental-learning tasks with high- (HSF) and low- (LSF) spatial-frequency-filtered emotional and neutral faces. The results of the surprise recognition tests showed a clear memory advantage for emotional stimuli. Most importantly, the emotional memory enhancement was significantly larger for face images containing only low-frequency information (LSF faces) than for HSF faces across all experiments, suggesting that LSF information plays a critical role in this effect, whereas the emotion-induced recognition bias was found only for HSF stimuli. We discuss our findings in terms of both the traditional account of different processing pathways for HSF and LSF information and a stimulus features account. The double dissociation in the results favors the latter account-that is, an explanation in terms of differences in the characteristics of HSF and LSF stimuli.
Rosenberg, Hannah; McDonald, Skye; Dethier, Marie; Kessels, Roy P C; Westbrook, R Frederick
2014-11-01
Many individuals who sustain moderate-severe traumatic brain injuries (TBI) are poor at recognizing emotional expressions, with a greater impairment in recognizing negative (e.g., fear, disgust, sadness, and anger) than positive emotions (e.g., happiness and surprise). It has been questioned whether this "valence effect" might be an artifact of the wide use of static facial emotion stimuli (usually full-blown expressions) which differ in difficulty rather than a real consequence of brain impairment. This study aimed to investigate the valence effect in TBI, while examining emotion recognition across different intensities (low, medium, and high). Twenty-seven individuals with TBI and 28 matched control participants were tested on the Emotion Recognition Task (ERT). The TBI group was more impaired in overall emotion recognition, and less accurate recognizing negative emotions. However, examining the performance across the different intensities indicated that this difference was driven by some emotions (e.g., happiness) being much easier to recognize than others (e.g., fear and surprise). Our findings indicate that individuals with TBI have an overall deficit in facial emotion recognition, and that both people with TBI and control participants found some emotions more difficult than others. These results suggest that conventional measures of facial affect recognition that do not examine variance in the difficulty of emotions may produce erroneous conclusions about differential impairment. They also cast doubt on the notion that dissociable neural pathways underlie the recognition of positive and negative emotions, which are differentially affected by TBI and potentially other neurological or psychiatric disorders.
Martin-Key, N; Brown, T; Fairchild, G
2017-10-01
Adolescents with disruptive behavior disorders are reported to show deficits in empathy and emotion recognition. However, prior studies have mainly used questionnaires to measure empathy or experimental paradigms that are lacking in ecological validity. We used an empathic accuracy (EA) task to study EA, emotion recognition, and affective empathy in 77 male adolescents aged 13-18 years: 37 with Conduct Disorder (CD) and 40 typically-developing controls. The CD sample was divided into higher callous-unemotional traits (CD/CU+) and lower callous-unemotional traits (CD/CU-) subgroups using a median split. Participants watched films of actors recalling happy, sad, surprised, angry, disgusted or fearful autobiographical experiences and provided continuous ratings of emotional intensity (assessing EA), as well as naming the emotion (recognition) and reporting the emotion they experienced themselves (affective empathy). The CD and typically-developing groups did not significantly differ in EA and there were also no differences between the CD/CU+ and CD/CU- subgroups. Participants with CD were significantly less accurate than controls in recognizing sadness, fear, and disgust, all ps < 0.050, rs ≥ 0.30, whilst the CD/CU- and CD/CU+ subgroups did not differ in emotion recognition. Participants with CD also showed affective empathy deficits for sadness, fear, and disgust relative to controls, all ps < 0.010, rs ≥ 0.33, whereas the CD/CU+ and CD/CU- subgroups did not differ in affective empathy. These results extend prior research by demonstrating affective empathy and emotion recognition deficits in adolescents with CD using a more ecologically-valid task, and challenge the view that affective empathy deficits are specific to CD/CU+.
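As a rough illustration of the subgrouping step described above, here is a minimal Python sketch of a median split on callous-unemotional (CU) trait scores. The scores, their range, and the tie-handling convention (scores strictly above the median go to CD/CU+) are all assumptions for illustration and not taken from the study.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Hypothetical CU-trait scores for the 37 participants with CD
# (real scores and the questionnaire's range are not given in the abstract).
cd = pd.DataFrame({
    "participant": np.arange(1, 38),
    "cu_score": rng.integers(0, 41, size=37),
})

# Median split: scores above the median form CD/CU+, the rest CD/CU-.
median_cu = cd["cu_score"].median()
cd["subgroup"] = np.where(cd["cu_score"] > median_cu, "CD/CU+", "CD/CU-")

print(cd.groupby("subgroup")["cu_score"].agg(["count", "mean"]))
```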
White, Corey N.; Kapucu, Aycan; Bruno, Davide; Rotello, Caren M.; Ratcliff, Roger
2014-01-01
Recognition memory studies often find that emotional items are more likely than neutral items to be labeled as studied. Previous work suggests this bias is driven by increased memory strength/familiarity for emotional items. We explored strength and bias interpretations of this effect with the conjecture that emotional stimuli might seem more familiar because they share features with studied items from the same category. Categorical effects were manipulated in a recognition task by presenting lists with a small, medium, or large proportion of emotional words. The liberal memory bias for emotional words was only observed when a medium or large proportion of categorized words were presented in the lists. Similar, though weaker, effects were observed with categorized words that were not emotional (animal names). These results suggest that liberal memory bias for emotional items may be largely driven by effects of category membership. PMID:24303902
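The "liberal memory bias" described above is the signal-detection notion of a response criterion shifted toward "old" responses. The Python sketch below computes the standard equal-variance indices d' (discriminability) and c (criterion) from hit and false-alarm rates; the example rates are hypothetical and not taken from the study.

```python
from scipy.stats import norm

def dprime_and_criterion(hit_rate: float, fa_rate: float) -> tuple[float, float]:
    """Equal-variance signal-detection indices.

    d' = z(H) - z(F)            memory strength (discriminability)
    c  = -0.5 * (z(H) + z(F))   response criterion; more negative = more liberal
    """
    z_h, z_f = norm.ppf(hit_rate), norm.ppf(fa_rate)
    return z_h - z_f, -0.5 * (z_h + z_f)

# Hypothetical rates: emotional words attract more "old" responses overall,
# which lowers c (a more liberal criterion) without necessarily raising d'.
print(dprime_and_criterion(hit_rate=0.80, fa_rate=0.35))  # emotional items
print(dprime_and_criterion(hit_rate=0.75, fa_rate=0.20))  # neutral items
```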
Morgenthaler, Jarste; Wiesner, Christian D; Hinze, Karoline; Abels, Lena C; Prehn-Kristensen, Alexander; Göder, Robert
2014-01-01
Sleep enhances memory consolidation and it has been hypothesized that rapid eye movement (REM) sleep in particular facilitates the consolidation of emotional memory. The aim of this study was to investigate this hypothesis using selective REM-sleep deprivation. We used a recognition memory task in which participants were shown negative and neutral pictures. Participants (N=29 healthy medical students) were separated into two groups (undisturbed sleep and selective REM-sleep deprived). Both groups also worked on the memory task in a wake condition. Recognition accuracy was significantly better for negative than for neutral stimuli and better after the sleep than the wake condition. There was, however, no difference in the recognition accuracy (neutral and emotional) between the groups. In summary, our data suggest that REM-sleep deprivation was successful and that the resulting reduction of REM-sleep had no influence on memory consolidation whatsoever.
Electroconvulsive therapy regulates emotional memory bias of depressed patients.
Bai, Tongjian; Xie, Wen; Wei, Qiang; Chen, Yang; Mu, Jingjing; Tian, Yanghua; Wang, Kai
2017-11-01
Emotional memory bias is considered to be an important basis of the etiology of depression and can be reversed by antidepressants by enhancing memory for positive stimuli. Another antidepressant treatment, electroconvulsive therapy (ECT), has a rapid antidepressant effect and frequently causes short-term memory impairment. However, the short-term effect of ECT on memory bias remains unclear. In this study, an incidental memory task with emotional pictures was applied to evaluate the emotional memory of twenty depressed patients pre- and post-ECT (three days after ECT) compared to twenty healthy controls. The depressive symptoms were evaluated using the Hamilton rating scale of depression (HRSD). Before ECT, patients showed decreased recognition memory for positive pictures compared to controls and remembered negative pictures more easily than positive pictures in the recognition task. In patients, the main effect of session (pre-ECT and post-ECT) was significant for both recognition and recall memory, with reduced memory performance after ECT. The interaction between valence (positive, neutral and negative) and session was significant for recognition memory, indicating that negative memory was impaired more severely than positive memory. Our study indicates that ECT relieves depressive symptoms and regulates emotional memory through more severe impairment of memory for negative stimuli. Copyright © 2017. Published by Elsevier B.V.
Lima, César F; Garrett, Carolina; Castro, São Luís
2013-01-01
Does emotion processing in music and speech prosody recruit common neurocognitive mechanisms? To examine this question, we implemented a cross-domain comparative design in Parkinson's disease (PD). Twenty-four patients and 25 controls performed emotion recognition tasks for music and spoken sentences. In music, patients had impaired recognition of happiness and peacefulness, and intact recognition of sadness and fear; this pattern was independent of general cognitive and perceptual abilities. In speech, patients had a small global impairment, which was significantly mediated by executive dysfunction. Hence, PD affected musical and prosodic emotions differently. This dissociation indicates that the mechanisms underlying the two domains are partly independent.
Lysaker, Paul H; Leonhardt, Bethany L; Brüne, Martin; Buck, Kelly D; James, Alison; Vohs, Jenifer; Francis, Michael; Hamm, Jay A; Salvatore, Giampaolo; Ringer, Jamie M; Dimaggio, Giancarlo
2014-09-30
While many with schizophrenia spectrum disorders experience difficulties understanding the feelings of others, little is known about the psychological antecedents of these deficits. To explore these issues we examined whether deficits in mental state decoding, mental state reasoning and metacognitive capacity predict performance on an emotion recognition task. Participants were 115 adults with a schizophrenia spectrum disorder and 58 adults with substance use disorders but no history of a diagnosis of psychosis who completed the Eyes and Hinting Tests. Metacognitive capacity was assessed using the Metacognitive Assessment Scale Abbreviated and emotion recognition was assessed using the Bell Lysaker Emotion Recognition Test. Results revealed that the schizophrenia patients performed more poorly than controls on tests of emotion recognition, mental state decoding, mental state reasoning and metacognition. Lesser capacities for mental state decoding, mental state reasoning and metacognition were all uniquely related to emotion recognition within the schizophrenia group even after controlling for neurocognition and symptoms in a stepwise multiple regression. Results suggest that deficits in emotion recognition in schizophrenia may partly result from a combination of impairments in the ability to judge the cognitive and affective states of others and difficulties forming complex representations of self and others. Published by Elsevier Ireland Ltd.
Cued uncertainty modulates later recognition of emotional pictures: An ERP study.
Lin, Huiyan; Xiang, Jing; Li, Saili; Liang, Jiafeng; Zhao, Dongmei; Yin, Desheng; Jin, Hua
2017-06-01
Previous studies have shown that uncertainty about the emotional content of an upcoming event modulates event-related potentials (ERPs) during the encoding of the event, and this modulation is affected by whether there are cues (i.e., cued uncertainty) or not (i.e., uncued uncertainty) prior to the encoding of the uncertain event. Recently, we showed that uncued uncertainty affected ERPs in later recognition of the emotional event. However, it is as yet unknown how recognition-related ERP effects are modulated by cued uncertainty. To address this issue, participants were asked to view emotional (negative and neutral) pictures that were presented after cues. The cues either indicated the emotional content of the pictures (the certain condition) or not (the cued uncertain condition). Subsequently, participants had to perform an unexpected old/new task in which old and novel pictures were shown without any cues. ERP data in the old/new task showed smaller P2 amplitudes for neutral pictures in the cued uncertain condition compared to the certain condition, but this uncertainty effect was not observed for negative pictures. Additionally, P3 amplitudes were generally enlarged for pictures in the cued uncertain condition. Taken together, the present findings indicate that cued uncertainty alters later recognition of emotional events in relation to feature processing and attention allocation. Copyright © 2017. Published by Elsevier B.V.
Mohan, S N; Mukhtar, F; Jobson, L
2016-10-21
Depression is a mood disorder that affects a significant proportion of the population worldwide. In Malaysia and Australia, the number of people diagnosed with depression is on the rise. It has been found that impairments in emotion processing and emotion regulation play a role in the development and maintenance of depression. This study is based on Matsumoto and Hwang's biocultural model of emotion and Triandis' Subjective Culture model. It aims to investigate the influence of culture on emotion processing among Malaysians and Australians with and without major depressive disorder (MDD). This study will adopt a between-group design. Participants will include Malaysian Malays and Caucasian Australians with and without MDD (N=320). There will be four tasks involved in this study, namely: (1) the facial emotion recognition task, (2) the biological motion task, (3) the subjective experience task and (4) the emotion meaning task. It is hypothesised that there will be cultural differences in how participants with and without MDD respond to these emotion tasks and that, pan-culturally, MDD will influence accuracy rates in the facial emotion recognition task and the biological motion task. This study is approved by the Universiti Putra Malaysia Research Ethics Committee (JKEUPM) and the Monash University Human Research Ethics Committee (MUHREC). Permission to conduct the study has also been obtained from the National Medical Research Register (NMRR; NMRR-15-2314-26919). On completion of the study, data will be kept by Universiti Putra Malaysia for a specific period of time before they are destroyed. Data will be published in a collective manner in the form of journal articles with no reference to a specific individual. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Differential effects of emotional cues on components of prospective memory: an ERP study
Cona, Giorgia; Kliegel, Matthias; Bisiacchi, Patrizia S.
2015-01-01
So far, little is known about the neurocognitive mechanisms associated with emotion effects on prospective memory (PM) performance. Thus, this study aimed at disentangling possible mechanisms for the effects of emotional valence of PM cues on the distinct phases composing PM by investigating event-related potentials (ERPs). Participants were engaged in an ongoing N-back task while being required to perform a PM task. The emotional valence of both the ongoing pictures and the PM cues was manipulated (pleasant, neutral, unpleasant). ERPs were recorded during the PM phases, such as encoding, maintenance, and retrieval of the intention. A recognition task including PM cues and ongoing stimuli was also performed at the end of the sessions. ERP results suggest that emotional PM cues not only trigger an automatic, bottom-up, capture of attention, but also boost a greater allocation of top-down processes. These processes seem to be recruited to hold attention toward the emotional stimuli and to retrieve the intention from memory, likely because of the motivational significance of the emotional stimuli. Moreover, pleasant PM cues seemed to modulate especially the prospective component, as revealed by changes in the amplitude of the ERP correlates of strategic monitoring as a function of the relevance of the valence for the PM task. Unpleasant pictures seemed to modulate especially the retrospective component, as revealed by the largest old/new effect being elicited by unpleasant PM pictures in the recognition task. PMID:25674061
Breastfeeding experience differentially impacts recognition of happiness and anger in mothers.
Krol, Kathleen M; Kamboj, Sunjeev K; Curran, H Valerie; Grossmann, Tobias
2014-11-12
Breastfeeding is a dynamic biological and social process based on hormonal regulation involving oxytocin. While there is much work on the role of breastfeeding in infant development and on the role of oxytocin in socio-emotional functioning in adults, little is known about how breastfeeding impacts emotion perception during motherhood. We therefore examined whether breastfeeding influences emotion recognition in mothers. Using a dynamic emotion recognition task, we found that longer durations of exclusive breastfeeding were associated with faster recognition of happiness, providing evidence for facilitated processing of positive facial expressions. In addition, we found that a greater number of breastfed meals per day was associated with slower recognition of anger. Our findings are in line with current views of oxytocin function and support accounts that view maternal behaviour as tuned to prosocial responsiveness, by showing that vital elements of maternal care can facilitate rapid responding to affiliative stimuli and reduce the salience of threatening stimuli.
Dalili, Michael N; Schofield-Toloza, Lawrence; Munafò, Marcus R; Penton-Voak, Ian S
2017-08-01
Many cognitive bias modification (CBM) tasks use facial expressions of emotion as stimuli. Some tasks use unique facial stimuli, while others use composite stimuli, given evidence that emotion is encoded prototypically. However, CBM using composite stimuli may be identity- or emotion-specific, and may not generalise to other stimuli. We investigated the generalisability of effects using composite faces in two experiments. Healthy adults in each study were randomised to one of four training conditions: two stimulus-congruent conditions, where the same faces were used during all phases of the task, and two stimulus-incongruent conditions, where faces of the opposite sex (Experiment 1) or faces depicting another emotion (Experiment 2) were used after the modification phase. Our results suggested that training effects generalised across identities. However, our results indicated only partial generalisation across emotions. These findings suggest effects obtained using composite stimuli may extend beyond the stimuli used in the task but remain emotion-specific.
Berggren, Nick; Richards, Anne; Taylor, Joseph; Derakshan, Nazanin
2013-01-01
Trait anxiety is associated with deficits in attentional control, particularly in the ability to inhibit prepotent responses. Here, we investigated this effect while varying the level of cognitive load in a modified antisaccade task that employed emotional facial expressions (neutral, happy, and angry) as targets. Load was manipulated using a secondary auditory task requiring recognition of tones (low load), or recognition of specific tone pitch (high load). Results showed that load increased antisaccade latencies on trials where gaze toward face stimuli should be inhibited. This effect was exacerbated for high anxious individuals. Emotional expression also modulated task performance on antisaccade trials for both high and low anxious participants under low cognitive load, but did not influence performance under high load. Collectively, results (1) suggest that individuals reporting high levels of anxiety are particularly vulnerable to the effects of cognitive load on inhibition, and (2) support recent evidence that loading cognitive processes can reduce emotional influences on attention and cognition. PMID:23717273
ERIC Educational Resources Information Center
Doody, John P.; Bull, Peter
2013-01-01
While most studies of emotion recognition in Asperger's Syndrome (AS) have focused solely on the verbal decoding of affective states, the current research employed the novel technique of using both nonverbal matching and verbal labeling tasks to examine the decoding of emotional body postures and facial expressions. AS participants performed…
Fink, Elian; de Rosnay, Marc; Wierda, Marlies; Koot, Hans M; Begeer, Sander
2014-09-01
The empirical literature has presented inconsistent evidence for deficits in the recognition of basic emotion expressions in children with autism spectrum disorders (ASD), which may be due to the reliance on studies with relatively small sample sizes. Additionally, it has been proposed that although children with ASD may correctly identify emotion expressions, they rely on more deliberate, time-consuming strategies than typically developing children to recognize them accurately. In the current study, we examine both emotion recognition accuracy and response time in a large sample of children, and explore the moderating influence of verbal ability on these findings. The sample consisted of 86 children with ASD (M age = 10.65) and 114 typically developing children (M age = 10.32) between 7 and 13 years of age. All children completed a pre-test (emotion word-word matching) and a test phase consisting of basic emotion recognition, whereby they were required to match a target emotion expression to the correct emotion word; accuracy and response time were recorded. Verbal IQ was controlled for in the analyses. We found no evidence of a systematic deficit in emotion recognition accuracy or response time for children with ASD, controlling for verbal ability. However, when controlling for children's accuracy in word-word matching, children with ASD had significantly lower emotion recognition accuracy when compared to typically developing children. The findings suggest that the social impairments observed in children with ASD are not the result of marked deficits in basic emotion recognition accuracy or longer response times. However, children with ASD may be relying on other perceptual skills (such as advanced word-word matching) to complete emotion recognition tasks at a similar level to typically developing children.
Lopez-Duran, Nestor L.; Kuhlman, Kate R.; George, Charles; Kovacs, Maria
2012-01-01
In the present study we examined perceptual sensitivity to facial expressions of sadness among children at familial-risk for depression (N = 64) and low-risk peers (N = 40) between the ages of 7 and 13 (M age = 9.51; SD = 2.27). Participants were presented with pictures of facial expressions that varied in emotional intensity from neutral to full-intensity sadness or anger (i.e., emotion recognition), or pictures of faces morphing from anger to sadness (emotion discrimination). After each picture was presented, children indicated whether the face showed a specific emotion (i.e., sadness, anger) or no emotion at all (neutral). In the emotion recognition task, boys (but not girls) at familial-risk for depression identified sadness at significantly lower levels of emotional intensity than did their low-risk peers. The high and low-risk groups did not differ with regard to identification of anger. In the emotion discrimination task, both groups displayed over-identification of sadness in ambiguous mixed faces but high-risk youth were less likely to show this labeling bias than their peers. Our findings are consistent with the hypothesis that enhanced perceptual sensitivity to subtle traces of sadness in facial expressions may be a potential mechanism of risk among boys at familial-risk for depression. This enhanced perceptual sensitivity does not appear to be due to biases in the labeling of ambiguous faces. PMID:23106941
Mood-congruent false memories persist over time.
Knott, Lauren M; Thorley, Craig
2014-01-01
In this study, we examined the role of mood congruency and retention interval in the false recognition of emotion-laden items using the Deese/Roediger-McDermott (DRM) paradigm. Previous research has shown a mood-congruent false memory enhancement during immediate recognition tasks. The present study examined the persistence of this effect following a one-week delay. Participants were placed in a negative or neutral mood, presented with negative-emotion and neutral-emotion DRM word lists, and given both immediate and delayed recognition tests. Results showed that a negative mood state increased remember judgments for negative-emotion critical lures, in comparison to neutral-emotion critical lures, on both immediate and delayed testing. These findings are discussed in relation to theories of spreading activation and emotion-enhanced memory, with consideration of the applied forensic implications of such findings.
Feeser, Melanie; Fan, Yan; Weigand, Anne; Hahn, Adam; Gärtner, Matti; Aust, Sabine; Böker, Heinz; Bajbouj, Malek; Grimm, Simone
2014-12-01
Previous studies have shown that oxytocin (OXT) enhances social cognitive processes. It has also been demonstrated that OXT does not uniformly facilitate social cognition. The effects of OXT administration strongly depend on the exposure to stressful experiences in early life. Emotional facial recognition is crucial for social cognition. However, no study has yet examined how the effects of OXT on the ability to identify emotional faces are altered by early life stress (ELS) experiences. Given the role of OXT in modulating social motivational processes, we specifically aimed to investigate its effects on the recognition of approach- and avoidance-related facial emotions. In a double-blind, between-subjects, placebo-controlled design, 82 male participants performed an emotion recognition task with faces taken from the "Karolinska Directed Emotional Faces" set. We clustered the six basic emotions along the dimensions approach (happy, surprise, anger) and avoidance (fear, sadness, disgust). ELS was assessed with the Childhood Trauma Questionnaire (CTQ). Our results showed that OXT improved the ability to recognize avoidance-related emotional faces as compared to approach-related emotional faces. Whereas the performance for avoidance-related emotions in participants with higher ELS scores was comparable in both OXT and placebo condition, OXT enhanced emotion recognition in participants with lower ELS scores. Independent of OXT administration, we observed increased emotion recognition for avoidance-related faces in participants with high ELS scores. Our findings suggest that the investigation of OXT on social recognition requires a broad approach that takes ELS experiences as well as motivational processes into account.
Sex differences in functional activation patterns revealed by increased emotion processing demands.
Hall, Geoffrey B C; Witelson, Sandra F; Szechtman, Henry; Nahmias, Claude
2004-02-09
Two [15O] PET studies assessed sex differences in regional brain activation during the recognition of emotional stimuli. Study I revealed that the recognition of emotion in visual faces resulted in bilateral frontal activation in women, and unilateral right-sided activation in men. In study II, the complexity of the emotional face task was increased through the addition of associated auditory emotional stimuli. Men again showed unilateral frontal activation, in this case to the left, whereas women did not show bilateral frontal activation, but showed greater limbic activity. These results suggest that when processing broader cross-modal emotional stimuli, men engage more in associative cognitive strategies while women draw more on primary emotional references.
McCubbin, James A; Loveless, James P; Graham, Jack G; Hall, Gabrielle A; Bart, Ryan M; Moore, DeWayne D; Merritt, Marcellus M; Lane, Richard D; Thayer, Julian F
2014-02-01
Persons with higher blood pressure have emotional dampening in some contexts. This may reflect interactive changes in central nervous system control of affect and autonomic function in the early stages of hypertension development. The purpose of this study is to determine the independence of cardiovascular emotional dampening from alexithymia to better understand the role of affect dysregulation in blood pressure elevations. Ninety-six normotensives were assessed for resting systolic and diastolic (DBP) blood pressure, recognition of emotions in faces and sentences using the Perception of Affect Task (PAT), alexithymia, anxiety, and defensiveness. Resting DBP significantly predicted PAT emotion recognition accuracy in men after adjustment for age, self-reported affect, and alexithymia. Cardiovascular emotional dampening is independent of alexithymia and affect in men. Dampened emotion recognition could potentially influence interpersonal communication and psychosocial distress, thereby further contributing to BP dysregulation and increased cardiovascular risk.
Inconsistent emotion recognition deficits across stimulus modalities in Huntington's disease.
Rees, Elin M; Farmer, Ruth; Cole, James H; Henley, Susie M D; Sprengelmeyer, Reiner; Frost, Chris; Scahill, Rachael I; Hobbs, Nicola Z; Tabrizi, Sarah J
2014-11-01
Recognition of negative emotions is impaired in Huntington's disease (HD). It is unclear whether these emotion-specific problems are driven by dissociable cognitive deficits, emotion complexity, test cue difficulty, or visuoperceptual impairments. This study set out to further characterise emotion recognition in HD by comparing patterns of deficits across stimulus modalities, notably including, for the first time in HD, the more ecologically and clinically relevant modality of film clips portraying dynamic facial expressions. Fifteen early HD and 17 control participants were tested on emotion recognition from static facial photographs, non-verbal vocal expressions and one-second dynamic film clips, all depicting different emotions. Statistically significant evidence of impairment of anger, disgust and fear recognition was seen in HD participants compared with healthy controls across multiple stimulus modalities. The extent of the impairment, as measured by the difference in the number of errors made between HD participants and controls, differed according to the combination of emotion and modality (p=0.013, interaction test). The largest between-group difference was seen in the recognition of anger from film clips. Consistent with previous reports, anger, disgust and fear were the most poorly recognised emotions by the HD group. This impairment did not appear to be due to task demands or expression complexity as the pattern of between-group differences did not correspond to the pattern of errors made by either group; implicating emotion-specific cognitive processing pathology. There was however evidence that the extent of emotion recognition deficits significantly differed between stimulus modalities. The implications in terms of designing future tests of emotion recognition and care giving are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.
Leppänen, J M; Niehaus, D J H; Koen, L; Du Toit, E; Schoeman, R; Emsley, R
2006-06-01
Schizophrenia is associated with a deficit in the recognition of negative emotions from facial expressions. The present study examined the universality of this finding by studying facial expression recognition in an African Xhosa population. Forty-four Xhosa patients with schizophrenia and forty healthy controls were tested with a computerized task requiring rapid perceptual discrimination of matched positive (i.e. happy), negative (i.e. angry), and neutral faces. Patients were equally accurate as controls in recognizing happy faces but showed a marked impairment in recognition of angry faces. The impairment was particularly pronounced for high-intensity (open-mouth) angry faces. Patients also exhibited more false happy and angry responses to neutral faces than controls. No correlation between level of education or illness duration and emotion recognition was found but the deficit in the recognition of negative emotions was more pronounced in familial compared to non-familial cases of schizophrenia. These findings suggest that the deficit in the recognition of negative facial expressions may constitute a universal neurocognitive marker of schizophrenia.
Dolder, Patrick C; Holze, Friederike; Liakoni, Evangelia; Harder, Samuel; Schmid, Yasmin; Liechti, Matthias E
2017-01-01
Social cognition influences social interactions. Alcohol reportedly facilitates social interactions. However, the acute effects of alcohol on social cognition are relatively poorly studied. We investigated the effects of alcoholic or non-alcoholic beer on emotion recognition, empathy, and sexual arousal using the dynamic face emotion recognition task (FERT), Multifaceted Empathy Test (MET), and Sexual Arousal Task (SAT) in a double-blind, random-order, cross-over study in 60 healthy social drinkers. We also assessed subjective effects using visual analog scales (VASs), blood alcohol concentrations, and plasma oxytocin levels. Alcohol increased VAS ratings of stimulated, happy, talkative, open, and want to be with others. The subjective effects of alcohol were greater in participants with higher trait inhibitedness. Alcohol facilitated the recognition of happy faces on the FERT and enhanced emotional empathy for positive stimuli on the MET, particularly in participants with low trait empathy. Pictures of explicit sexual content were rated as less pleasant than neutral pictures after non-alcoholic beer but not after alcoholic beer. Explicit sexual pictures were rated as more pleasant after alcoholic beer compared with non-alcoholic beer, particularly in women. Alcohol did not alter the levels of circulating oxytocin. Alcohol biased emotion recognition toward better decoding of positive emotions and increased emotional concern for positive stimuli. No support was found for a modulatory role of oxytocin. Alcohol also facilitated the viewing of sexual images, consistent with disinhibition, but it did not actually enhance sexual arousal. These effects of alcohol on social cognition likely enhance sociability. www.clinicaltrials.gov/ct2/show/NCT02318823.
Effects of Power on Mental Rotation and Emotion Recognition in Women.
Nissan, Tali; Shapira, Oren; Liberman, Nira
2015-10-01
Based on construal-level theory (CLT) and its view of power as an instance of social distance, we predicted that high, relative to low, power would enhance women's mental-rotation performance and impede their emotion-recognition performance. The predicted effects of power emerged both when power was manipulated via a recall priming task (Study 1) and when it was manipulated via environmental cues (Studies 2 and 3). Studies 3 and 4 found evidence for mediation by construal level of the effect of power on emotion recognition but not on mental rotation. We discuss potential mediating mechanisms for these effects based on both the social distance/construal level and the approach/inhibition views of power. We also discuss implications for optimizing performance on mental rotation and emotion recognition in everyday life. © 2015 by the Society for Personality and Social Psychology, Inc.
Patterns of Attachment and Emotional Competence in Middle Childhood
ERIC Educational Resources Information Center
Colle, Livia; Del Giudice, Marco
2011-01-01
The study investigated the relationship between patterns of attachment and emotional competence at the beginning of middle childhood in a sample of 122 seven-year-olds. A new battery of tasks was developed in order to assess two facets of emotional competence (emotion recognition and knowledge of regulation strategies). Attachment was related to…
Vongas, John G; Al Hajj, Raghid
2017-06-01
A contribution to a special issue on Hormones and Human Competition. We investigated the effects of competition on men's testosterone levels and assessed whether androgen reactivity was associated with subsequent emotion recognition and reactive and proactive aggression. We also explored whether personalized power (p Power) moderated these relationships. In Study 1, 84 males competed on a number tracing task and interpreted emotions from facial expressions. In Study 2, 72 males competed on the same task and were assessed on proactive and reactive aggression. In both studies, contrary to the biosocial model of status (Mazur, 1985), winners' testosterone levels decreased significantly while losers' levels increased, albeit not significantly. Personalized power moderated the effect of competition outcome on testosterone change in both studies. Using the aggregate sample, we found that the effect of decreased testosterone levels among winners (compared to losers) was significant for individuals low in p Power but not for those with medium or high p Power. Testosterone change was positively related to emotion recognition, but unrelated to either aggression subtype. The testosterone-mediated relationship between winning and losing and emotion recognition was moderated by p Power. In addition, p Power moderated the direct (i.e., non-testosterone mediated) path between competition outcome and emotion recognition and both types of aggression: high p-Power winners were more accurate at deciphering others' emotions than high p-Power losers. Finally, among high p-Power men, winners aggressed more proactively than losers, whereas losers aggressed more reactively than winners. Collectively, these studies highlight the importance of implicit power motivation in modulating hormonal, cognitive, and behavioral outcomes arising from human competition. Copyright © 2017 Elsevier Inc. All rights reserved.
How Mood and Task Complexity Affect Children's Recognition of Others’ Emotions
Cummings, Andrew J.; Rennels, Jennifer L.
2013-01-01
Previous studies examined how mood affects children's accuracy in matching emotional expressions and labels (label-based tasks). This study was the first to assess how induced mood (positive, neutral, or negative) influenced 5- to 8-year-olds' accuracy and reaction time using both context-based tasks, which required inferring a character's emotion from a vignette, and label-based tasks. Both tasks required choosing one of four facial expressions to respond. Children aged 5 to 7 years responded more accurately to label-based questions than to context-based questions, whereas 8-year-olds showed no difference; this label-based advantage held when the emotional expression being identified was happiness, sadness, or surprise, but not disgust. For the context-based questions, children were more accurate at inferring sad and disgusted emotions compared to happy and surprised emotions. Induced positive mood facilitated 5-year-olds' processing (decreased reaction time) in both tasks compared to induced negative and neutral moods. Results demonstrate how task type and children's mood influence children's emotion processing at different ages. PMID:24489442
Emotion recognition ability in mothers at high and low risk for child physical abuse.
Balge, K A; Milner, J S
2000-10-01
The study sought to determine if high-risk, compared to low-risk, mothers make more emotion recognition errors when they attempt to recognize emotions in children and adults. Thirty-two demographically matched high-risk (n = 16) and low-risk (n = 16) mothers were asked to identify different emotions expressed by children and adults. Sets of high- and low-intensity, visual and auditory emotions were presented. Mothers also completed measures of stress, depression, and ego-strength. High-risk, compared to low-risk, mothers showed a tendency to make more errors on the visual and auditory emotion recognition tasks, with a trend toward more errors on the low-intensity, visual stimuli. However, the observed trends were not significant. Only a post-hoc test of error rates across all stimuli indicated that high-risk, compared to low-risk, mothers made significantly more emotion recognition errors. Although situational stress differences were not found, high-risk mothers reported significantly higher levels of general parenting stress and depression and lower levels of ego-strength. Since only trends and a significant post hoc finding of more overall emotion recognition errors in high-risk mothers were observed, additional research is needed to determine if high-risk mothers have emotion recognition deficits that may impact parent-child interactions. As in prior research, the study found that high-risk mothers reported more parenting stress and depression and less ego-strength.
Facial Expression Influences Face Identity Recognition During the Attentional Blink
2014-01-01
Emotional stimuli (e.g., negative facial expressions) enjoy prioritized memory access when task relevant, consistent with their ability to capture attention. Whether emotional expression also impacts on memory access when task-irrelevant is important for arbitrating between feature-based and object-based attentional capture. Here, the authors address this question in 3 experiments using an attentional blink task with face photographs as first and second target (T1, T2). They demonstrate reduced neutral T2 identity recognition after angry or happy T1 expression, compared to neutral T1, and this supports attentional capture by a task-irrelevant feature. Crucially, after neutral T1, T2 identity recognition was enhanced and not suppressed when T2 was angry, suggesting that attentional capture by this task-irrelevant feature may be object-based and not feature-based. As an unexpected finding, both angry and happy facial expressions suppressed memory access for competing objects, but only angry expressions enjoyed privileged memory access. This could imply that these 2 processes are relatively independent from one another. PMID:25286076
Facial expression influences face identity recognition during the attentional blink.
Bach, Dominik R; Schmidt-Daffy, Martin; Dolan, Raymond J
2014-12-01
Emotional stimuli (e.g., negative facial expressions) enjoy prioritized memory access when task relevant, consistent with their ability to capture attention. Whether emotional expression also impacts on memory access when task-irrelevant is important for arbitrating between feature-based and object-based attentional capture. Here, the authors address this question in 3 experiments using an attentional blink task with face photographs as first and second target (T1, T2). They demonstrate reduced neutral T2 identity recognition after angry or happy T1 expression, compared to neutral T1, and this supports attentional capture by a task-irrelevant feature. Crucially, after neutral T1, T2 identity recognition was enhanced and not suppressed when T2 was angry, suggesting that attentional capture by this task-irrelevant feature may be object-based and not feature-based. As an unexpected finding, both angry and happy facial expressions suppressed memory access for competing objects, but only angry expressions enjoyed privileged memory access. This could imply that these 2 processes are relatively independent from one another.
Impact of Childhood Maltreatment on the Recognition of Facial Expressions of Emotions.
Ardizzi, Martina; Martini, Francesca; Umiltà, Maria Alessandra; Evangelista, Valentina; Ravera, Roberto; Gallese, Vittorio
2015-01-01
The development of the explicit recognition of facial expressions of emotions can be affected by childhood maltreatment experiences. A previous study demonstrated the existence of an explicit recognition bias for angry facial expressions among a population of adolescent Sierra Leonean street-boys exposed to high levels of maltreatment. In the present study, the recognition bias for angry facial expressions was investigated in a younger population of street-children and age-matched controls. Participants performed a forced-choice facial expressions recognition task. Recognition bias was measured as participants' tendency to over-attribute the anger label to other negative facial expressions. Participants' heart rate was assessed and related to their behavioral performance as an index of their stress-related physiological responses. Results demonstrated the presence of a recognition bias for angry facial expressions among street-children, also pinpointing a similar, although significantly less pronounced, tendency among controls. Participants' performance was controlled for age, cognitive and educational levels, and naming skills. None of these variables influenced the recognition bias for angry facial expressions. In contrast, a significant effect of heart rate on participants' tendency to use the anger label was evidenced. Taken together, these results suggest that childhood exposure to maltreatment experiences amplifies children's "pre-existing bias" for anger labeling in forced-choice emotion recognition tasks. Moreover, they strengthen the thesis that the recognition bias for angry facial expressions is a manifestation of a functional adaptive mechanism that tunes the victim's perceptual and attentional focus on salient environmental social stimuli.
Impact of Childhood Maltreatment on the Recognition of Facial Expressions of Emotions
Ardizzi, Martina; Martini, Francesca; Umiltà, Maria Alessandra; Evangelista, Valentina; Ravera, Roberto; Gallese, Vittorio
2015-01-01
The development of the explicit recognition of facial expressions of emotions can be affected by childhood maltreatment experiences. A previous study demonstrated the existence of an explicit recognition bias for angry facial expressions among a population of adolescent Sierra Leonean street-boys exposed to high levels of maltreatment. In the present study, the recognition bias for angry facial expressions was investigated in a younger population of street-children and age-matched controls. Participants performed a forced-choice facial expressions recognition task. Recognition bias was measured as participants' tendency to over-attribute the anger label to other negative facial expressions. Participants' heart rate was assessed and related to their behavioral performance as an index of their stress-related physiological responses. Results demonstrated the presence of a recognition bias for angry facial expressions among street-children, also pinpointing a similar, although significantly less pronounced, tendency among controls. Participants' performance was controlled for age, cognitive and educational levels, and naming skills. None of these variables influenced the recognition bias for angry facial expressions. In contrast, a significant effect of heart rate on participants' tendency to use the anger label was evidenced. Taken together, these results suggest that childhood exposure to maltreatment experiences amplifies children's "pre-existing bias" for anger labeling in forced-choice emotion recognition tasks. Moreover, they strengthen the thesis that the recognition bias for angry facial expressions is a manifestation of a functional adaptive mechanism that tunes the victim's perceptual and attentional focus on salient environmental social stimuli. PMID:26509890
A voxel-based lesion study on facial emotion recognition after penetrating brain injury
Dal Monte, Olga; Solomon, Jeffrey M.; Schintu, Selene; Knutson, Kristine M.; Strenziok, Maren; Pardini, Matteo; Leopold, Anne; Raymont, Vanessa; Grafman, Jordan
2013-01-01
The ability to read emotions in the face of another person is an important social skill that can be impaired in subjects with traumatic brain injury (TBI). To determine the brain regions that modulate facial emotion recognition, we conducted a whole-brain analysis using a well-validated facial emotion recognition task and voxel-based lesion symptom mapping (VLSM) in a large sample of patients with focal penetrating TBIs (pTBIs). Our results revealed that individuals with pTBI performed significantly worse than normal controls in recognizing unpleasant emotions. VLSM results showed that impairment in facial emotion recognition was due to damage in a bilateral fronto-temporo-limbic network, including medial prefrontal cortex (PFC), anterior cingulate cortex, left insula and temporal areas. Besides these common areas, damage to the bilateral and anterior regions of the PFC led to impairment in recognizing unpleasant emotions, whereas damage to bilateral posterior PFC and left temporal areas led to impairment in recognizing pleasant emotions. Our findings add empirical evidence that the ability to read pleasant and unpleasant emotions in other people's faces is a complex process involving not only a common network that includes bilateral fronto-temporo-limbic lobes, but also other regions depending on emotional valence. PMID:22496440
Castagna, Filomena; Montemagni, Cristiana; Maria Milani, Anna; Rocca, Giuseppe; Rocca, Paola; Casacchia, Massimo; Bogetto, Filippo
2013-02-28
This study aimed to evaluate the ability to decode emotion in the auditory and audiovisual modalities in a group of patients with schizophrenia, and to explore the role of cognition and psychopathology in affecting these emotion recognition abilities. Ninety-four outpatients in a stable phase and 51 healthy subjects were recruited. Patients were assessed through a psychiatric evaluation and a wide neuropsychological battery. All subjects completed the Comprehensive Affect Testing System (CATS), a group of computerized tests designed to evaluate emotion perception abilities. Relative to controls, patients were not impaired in the CATS tasks involving discrimination of nonemotional prosody, naming of emotional stimuli expressed by voice and judging the emotional content of a sentence, whereas they showed a specific impairment in decoding emotion in a conflicting auditory condition and in the multichannel modality. Prosody impairment was affected by executive functions, attention and negative symptoms, while the deficit in multisensory emotion recognition was affected by executive functions and negative symptoms. These emotion recognition deficits, rather than being associated purely with emotion perception disturbances in schizophrenia, are affected by core symptoms of the illness. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Yang, Chengqing; Zhang, Tianhong; Li, Zezhi; Heeramun-Aubeeluck, Anisha; Liu, Na; Huang, Nan; Zhang, Jie; He, Leiying; Li, Hui; Tang, Yingying; Chen, Fazhan; Liu, Fei; Wang, Jijun; Lu, Zheng
2015-10-08
Although many studies have examined executive functions and facial emotion recognition in people with schizophrenia, few have focused on the correlation between them. Furthermore, their relationship in the siblings of patients also remains unclear. The aim of the present study was to examine the correlation between executive functions and facial emotion recognition in patients with first-episode schizophrenia and their siblings. Thirty patients with first-episode schizophrenia, their twenty-six siblings, and thirty healthy controls were enrolled. They completed facial emotion recognition tasks using the Ekman Standard Faces Database, and executive functioning was measured by the Wisconsin Card Sorting Test (WCST). Hierarchical regression analysis was applied to assess the correlation between executive functions and facial emotion recognition. Our study found that in siblings, accuracy in recognizing low-degree 'disgust' expressions was negatively correlated with the total correct rate in the WCST (r = -0.614, p = 0.023) but positively correlated with the total errors in the WCST (r = 0.623, p = 0.020); accuracy in recognizing 'neutral' expressions was positively correlated with the total error rate in the WCST (r = 0.683, p = 0.014) and negatively correlated with the total correct rate in the WCST (r = -0.677, p = 0.017). People with schizophrenia showed an impairment in facial emotion recognition when identifying moderate-degree 'happy' expressions, the accuracy of which was significantly correlated with the number of completed categories of the WCST (R(2) = 0.432, P < .05). There were no correlations between executive functions and facial emotion recognition in the healthy control group. Our study demonstrated that facial emotion recognition impairment correlated with executive function impairment in people with schizophrenia and their unaffected siblings but not in healthy controls.
Vieillard, Sandrine; Gilet, Anne-Laure
2013-01-01
There is mounting evidence that aging is associated with the maintenance of positive affect and the decrease of negative affect to ensure emotion regulation goals. Previous empirical studies have primarily focused on visual or autobiographical forms of emotion communication. To date, little investigation has been done on musical emotions. The few studies that have addressed aging and emotions in music were mainly interested in emotion recognition, thus leaving unexplored the question of how aging may influence emotional responses to and memory for emotions conveyed by music. In the present study, eighteen older (60–84 years) and eighteen younger (19–24 years) listeners were asked to evaluate the strength of their experienced emotion on happy, peaceful, sad, and scary musical excerpts (Vieillard et al., 2008) while facial muscle activity was recorded. Participants then performed an incidental recognition task followed by a task in which they judged to what extent they experienced happiness, peacefulness, sadness, and fear when listening to music. Compared to younger adults, older adults (a) reported stronger emotional reactivity for happiness than for other emotion categories, (b) showed increased zygomatic activity for scary stimuli, (c) were more likely to falsely recognize happy music, and (d) showed a decrease in their responsiveness to sad and scary music. These results are in line with previous findings and extend them to emotion experience and memory recognition, corroborating the view that age-related changes shift emotional responses to music in a positive direction, away from negativity. PMID:24137141
Buchy, Lisa; Barbato, Mariapaola; Makowski, Carolina; Bray, Signe; MacMaster, Frank P; Deighton, Stephanie; Addington, Jean
2017-11-01
People with psychosis show deficits recognizing facial emotions and disrupted activation in the underlying neural circuitry. We evaluated associations between facial emotion recognition and cortical thickness using a correlation-based approach to map structural covariance networks across the brain. Fifteen people with early psychosis provided magnetic resonance scans and completed the Penn Emotion Recognition and Differentiation tasks. Fifteen historical controls provided magnetic resonance scans. Cortical thickness was computed using CIVET and analyzed with linear models. Seed-based structural covariance analysis was done using the mapping anatomical correlations across the cerebral cortex methodology. To map structural covariance networks involved in facial emotion recognition, the right somatosensory cortex and bilateral fusiform face areas were selected as seeds. Statistics were run in SurfStat. Findings showed greater cortical covariance between the right fusiform face region seed and the right orbitofrontal cortex in controls than in early psychosis subjects. Facial emotion recognition scores were not significantly associated with thickness in any region. A negative effect of Penn Differentiation scores on cortical covariance was seen between the left fusiform face area seed and the right superior parietal lobule in early psychosis subjects. Results suggest that facial emotion recognition ability is related to covariance in a temporal-parietal network in early psychosis. Copyright © 2017 Elsevier B.V. All rights reserved.
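As a side note on the method, the seed-based structural covariance approach described above can be illustrated with a minimal sketch: across subjects, cortical thickness at a seed vertex is correlated with thickness at every other vertex, and the resulting maps are compared between groups. The following is a generic, hypothetical Python/NumPy illustration, not the CIVET/SurfStat pipeline used in the study; the array shapes, seed index, and group data are assumptions.

```python
import numpy as np

def seed_covariance_map(thickness, seed_idx):
    """Pearson correlation between thickness at the seed vertex and every other
    vertex, computed across subjects (rows = subjects, columns = vertices)."""
    seed = thickness[:, seed_idx]
    z_seed = (seed - seed.mean()) / seed.std()
    z_all = (thickness - thickness.mean(axis=0)) / thickness.std(axis=0)
    return (z_seed[:, None] * z_all).mean(axis=0)  # one r value per vertex

# Hypothetical example: compare covariance maps between two groups of 15 subjects.
rng = np.random.default_rng(0)
controls = rng.normal(2.5, 0.2, size=(15, 1000))   # 15 subjects x 1000 vertices
patients = rng.normal(2.5, 0.2, size=(15, 1000))
seed = 42                                          # e.g., a fusiform seed vertex
group_difference = seed_covariance_map(controls, seed) - seed_covariance_map(patients, seed)
```

In practice the group contrast would be tested with the linear models the abstract mentions (e.g., in SurfStat) rather than taken as a raw difference of correlation maps.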
Voorthuis, Alexandra; Riem, Madelon M E; Van IJzendoorn, Marinus H; Bakermans-Kranenburg, Marian J
2014-09-11
The neuropeptide oxytocin facilitates parental caregiving and is involved in the processing of infant vocal cues. In this randomized controlled trial with functional magnetic resonance imaging, we examined the influence of intranasally administered oxytocin on neural activity during emotion recognition in infant faces. Blood oxygenation level dependent (BOLD) responses during emotion recognition were measured in 50 women who were administered 16 IU of oxytocin or a placebo. Participants performed an adapted version of the Infant Facial Expressions of Emotions from Looking at Pictures (IFEEL pictures), a task that has been developed to assess the perception and interpretation of infants' facial expressions. Oxytocin administration increased activation in the inferior frontal gyrus (IFG), the middle temporal gyrus (MTG) and the superior temporal gyrus (STG). However, oxytocin decreased performance on the IFEEL pictures task. Our findings suggest that oxytocin enhances the processing of facial cues of the emotional state of infants at the neural level, but at the same time it may decrease the correct interpretation of infants' facial expressions at the behavioral level. This article is part of a Special Issue entitled Oxytocin and Social Behavior. © 2013 Published by Elsevier B.V.
Ziaei, Maryam; Peira, Nathalie; Persson, Jonas
2014-02-15
Goal-directed behavior requires that cognitive operations can be protected from emotional distraction induced by task-irrelevant emotional stimuli. The brain processes involved in attending to relevant information while filtering out irrelevant information are still largely unknown. To investigate the neural and behavioral underpinnings of attending to task-relevant emotional stimuli while ignoring irrelevant stimuli, we used fMRI to assess brain responses during instructed attentional encoding within an emotional working memory (WM) paradigm. We showed that instructed attention to emotion during WM encoding resulted in enhanced performance, reflected in increased memory performance and reduced reaction times, compared to passive viewing. A similar performance benefit was also demonstrated for recognition memory, although for positive pictures only. Functional MRI data revealed a network of regions involved in directed attention to emotional information for both positive and negative pictures that included medial and lateral prefrontal cortices, fusiform gyrus, insula, the parahippocampal gyrus, and the amygdala. Moreover, we demonstrate that regions in the striatum, and regions associated with the default-mode network, were differentially activated for emotional distraction compared to neutral distraction. Activation in a subset of these regions was related to individual differences in WM and recognition memory performance, thus likely contributing to performing the task at an optimal level. The present results provide initial insights into the behavioral and neural consequences of instructed attention and emotional distraction during WM encoding. © 2013.
Speaker emotion recognition: from classical classifiers to deep neural networks
NASA Astrophysics Data System (ADS)
Mezghani, Eya; Charfeddine, Maha; Nicolas, Henri; Ben Amar, Chokri
2018-04-01
Speaker emotion recognition has been considered one of the most challenging tasks in recent years. In fact, automatic systems for security, medicine or education can be improved by taking the affective state of speech into account. In this paper, a twofold approach for speech emotion classification is proposed: first, a relevant set of acoustic features is adopted; second, numerous supervised training techniques, ranging from classic classifiers to deep learning, are evaluated. Experimental results indicate that deep architectures can improve classification performance on two affective databases, the Berlin Dataset of Emotional Speech and the Surrey Audio-Visual Expressed Emotion (SAVEE) dataset.
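The two-stage pipeline sketched in this abstract (acoustic feature extraction followed by supervised classification) can be illustrated with a minimal, hypothetical example: summarize each utterance with MFCC statistics and compare a classic classifier (an SVM) against a small neural network. The feature choice, hyperparameters, and the `clips` list of labelled audio files are assumptions for illustration, not the authors' actual system.

```python
import numpy as np
import librosa
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

def extract_features(wav_path, sr=16000, n_mfcc=13):
    """Summarize one utterance as the mean and std of its MFCCs."""
    y, _ = librosa.load(wav_path, sr=sr)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

def evaluate(clips):
    """clips: list of (wav_path, emotion_label) pairs, e.g. drawn from a local
    copy of EMO-DB or SAVEE (paths and labels are assumed, not bundled here)."""
    X = np.array([extract_features(path) for path, _ in clips])
    y = np.array([label for _, label in clips])
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0)
    models = {
        "svm": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10)),
        "mlp": make_pipeline(StandardScaler(),
                             MLPClassifier(hidden_layer_sizes=(64, 32),
                                           max_iter=500, random_state=0)),
    }
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        print(name, accuracy_score(y_te, model.predict(X_te)))
```

A deep architecture of the kind the abstract compares would typically replace the fixed MFCC summary with learned representations (e.g., a convolutional or recurrent network over spectrogram frames), but the train/evaluate structure stays the same.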
Naranjo, C; Kornreich, C; Campanella, S; Noël, X; Vandriette, Y; Gillain, B; de Longueville, X; Delatte, B; Verbanck, P; Constant, E
2011-02-01
The processing of emotional stimuli is thought to be negatively biased in major depression. This study investigates this issue using musical, vocal and facial affective stimuli. 23 depressed in-patients and 23 matched healthy controls were recruited. Affective information processing was assessed through musical, vocal and facial emotion recognition tasks. Depression, anxiety level and attention capacity were controlled. The depressed participants demonstrated less accurate identification of emotions than the control group in all three sorts of emotion-recognition tasks. The depressed group also gave higher intensity ratings than the controls when scoring negative emotions, and they were more likely to attribute negative emotions to neutral voices and faces. Our in-patient group might differ from the more general population of depressed adults. They were all taking anti-depressant medication, which may have had an influence on their emotional information processing. Major depression is associated with a general negative bias in the processing of emotional stimuli. Emotional processing impairment in depression is not confined to interpersonal stimuli (faces and voices) but also extends to the ability to accurately perceive emotions in music. © 2010 Elsevier B.V. All rights reserved.
Attenuated sensitivity to the emotions of others by insular lesion
Terasawa, Yuri; Kurosaki, Yoshiko; Ibata, Yukio; Moriguchi, Yoshiya; Umeda, Satoshi
2015-01-01
The insular cortex has been considered to be the neural base of visceral sensation for many years. Previous studies in psychology and cognitive neuroscience have accumulated evidence indicating that interoception is an essential factor in the subjective feeling of emotion. Recent neuroimaging studies have demonstrated that anterior insular cortex activation is associated with accessing interoceptive information and underpinning the subjective experience of emotional state. Only a small number of studies have focused on the influence of insular damage on emotion processing and interoceptive awareness. Moreover, disparate hypotheses have been proposed for the alteration of emotion processing by insular lesions. Some studies show that insular lesions yield an inability to understand and represent disgust exclusively, but other studies suggest that such lesions modulate arousal and valence judgments for both positive and negative emotions. In this study, we examined the alteration in emotion recognition in three cases with damage to the right insula and adjacent areas and well-preserved higher cognitive function. Participants performed an experimental task using morphed photos that ranged between neutral and emotional facial expressions (i.e., anger, sadness, disgust, and happiness). Recognition rates of particular emotions were calculated to measure emotional sensitivity. In addition, they performed a heartbeat perception task to measure interoceptive accuracy. The cases identified emotions with a high arousal level (e.g., anger) as less aroused emotions (e.g., sadness), and one case showed remarkably low interoceptive accuracy. The current results show that insular lesions lead to attenuated emotional sensitivity across emotions, rather than category-specific impairments such as to disgust. Despite the small number of cases, our findings suggest that the insular cortex modulates recognition of emotional saliency and mediates interoceptive and emotional awareness. PMID:26388817
Functional differences among those high and low on a trait measure of psychopathy.
Gordon, Heather L; Baird, Abigail A; End, Alison
2004-10-01
It has been established that individuals who score high on measures of psychopathy demonstrate difficulty when performing tasks requiring the interpretation of others' emotional states. The aim of this study was to elucidate the relation of emotion and cognition to individual differences on a standard psychopathy personality inventory (PPI) among a nonpsychiatric population. Twenty participants completed the PPI. Following survey completion, a mean split of their scores on the emotional-interpersonal factor was performed, and participants were placed into a high or low group. Functional magnetic resonance imaging data were collected while participants performed a recognition task that required attention be given to either the affect or identity of target stimuli. No significant behavioral differences were found. In response to the affect recognition task, significant differences between high- and low-scoring subjects were observed in several subregions of the frontal cortex, as well as the amygdala. No significant differences were found between the groups in response to the identity recognition condition. Results indicate that participants scoring high on the PPI, although not behaviorally distinct, demonstrate a significantly different pattern of neural activity (as measured by blood oxygen level-dependent contrast) in response to tasks that require affective processing. The results suggest a unique neural signature associated with personality differences in a nonpsychiatric population.
The coupling of emotion and cognition in the eye: introducing the pupil old/new effect.
Võ, Melissa L-H; Jacobs, Arthur M; Kuchinke, Lars; Hofmann, Markus; Conrad, Markus; Schacht, Annekathrin; Hutzler, Florian
2008-01-01
The study presented here investigated the effects of emotional valence on the memory for words by assessing both memory performance and pupillary responses during a recognition memory task. Participants had to make speeded judgments on whether a word presented in the test phase of the experiment had already been presented ("old") or not ("new"). An emotion-induced recognition bias was observed: Words with emotional content not only produced a higher amount of hits, but also elicited more false alarms than neutral words. Further, we found a distinct pupil old/new effect characterized as an elevated pupillary response to hits as opposed to correct rejections. Interestingly, this pupil old/new effect was clearly diminished for emotional words. We therefore argue that the pupil old/new effect is not only able to mirror memory retrieval processes, but also reflects modulation by an emotion-induced recognition bias.
The integration of visual context information in facial emotion recognition in 5- to 15-year-olds.
Theurel, Anne; Witt, Arnaud; Malsert, Jennifer; Lejeune, Fleur; Fiorentini, Chiara; Barisnikov, Koviljka; Gentaz, Edouard
2016-10-01
The current study investigated the role of congruent visual context information in the recognition of facial emotional expression in 190 participants from 5 to 15 years of age. Children performed a matching task that presented pictures with different facial emotional expressions (anger, disgust, happiness, fear, and sadness) in two conditions: with and without a visual context. The results showed that emotions presented with visual context information were recognized more accurately than those presented in the absence of visual context. The context effect remained steady with age but varied according to the emotion presented and the gender of participants. The findings demonstrated for the first time that children from the age of 5 years are able to integrate facial expression and visual context information, and this integration improves facial emotion recognition. Copyright © 2016 Elsevier Inc. All rights reserved.
Development of emotional facial recognition in late childhood and adolescence.
Thomas, Laura A; De Bellis, Michael D; Graham, Reiko; LaBar, Kevin S
2007-09-01
The ability to interpret emotions in facial expressions is crucial for social functioning across the lifespan. Facial expression recognition develops rapidly during infancy and improves with age during the preschool years. However, the developmental trajectory from late childhood to adulthood is less clear. We tested older children, adolescents and adults on a two-alternative forced-choice discrimination task using morphed faces that varied in emotional content. Actors appeared to pose expressions that changed incrementally along three progressions: neutral-to-fear, neutral-to-anger, and fear-to-anger. Across all three morph types, adults displayed more sensitivity to subtle changes in emotional expression than children and adolescents. Fear morphs and fear-to-anger blends showed a linear developmental trajectory, whereas anger morphs showed a quadratic trend, increasing sharply from adolescents to adults. The results provide evidence for late developmental changes in emotional expression recognition with some specificity in the time course for distinct emotions.
Guillery-Girard, Bérengère; Clochon, Patrice; Giffard, Bénédicte; Viard, Armelle; Egler, Pierre-Jean; Baleyte, Jean-Marc; Eustache, Francis; Dayan, Jacques
2013-09-01
"Travelling in time," a central feature of episodic memory is severely affected among individuals with Post Traumatic Stress Disorder (PTSD) with two opposite effects: vivid traumatic memories are unorganized in temporality (bottom-up processes), non-traumatic personal memories tend to lack spatio-temporal details and false recognitions occur more frequently that in the general population (top-down processes). To test the effect of these two types of processes (i.e. bottom-up and top-down) on emotional memory, we conducted two studies in healthy and traumatized adolescents, a period of life in which vulnerability to emotion is particularly high. Using negative and neutral images selected from the international affective picture system (IAPS), stimuli were divided into perceptual images (emotion generated by perceptual details) and conceptual images (emotion generated by the general meaning of the material). Both categories of stimuli were then used, along with neutral pictures, in a memory task with two phases (encoding and recognition). In both populations, we reported a differential effect of the emotional material on encoding and recognition. Negative perceptual scenes induced an attentional capture effect during encoding and enhanced the recollective distinctiveness. Conversely, the encoding of conceptual scenes was similar to neutral ones, but the conceptual relatedness induced false memories at retrieval. However, among individuals with PTSD, two subgroups of patients were identified. The first subgroup processed the scenes faster than controls, except for the perceptual scenes, and obtained similar performances to controls in the recognition task. The second subgroup group desmonstrated an attentional deficit in the encoding task with no benefit from the distinctiveness associated with negative perceptual scenes on memory performances. These findings provide a new perspective on how negative emotional information may have opposite influences on memory in normal and traumatized individuals. It also gives clues to understand how intrusive memories and overgeneralization takes place in PTSD. Copyright © 2013 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Liao, Zongqing; Li, Yan; Su, Yanjie
2014-01-01
This study examined emotion understanding and reconciliation in 47 (24 girls) 4-6-year-old preschool children. Participants first completed emotion recognition tasks and then answered questions regarding reconciliation tendencies and affective perspective-taking in a series of overt and relational aggressive conflict scenarios. Children's teachers…
Altering sensorimotor feedback disrupts visual discrimination of facial expressions.
Wood, Adrienne; Lupyan, Gary; Sherrin, Steven; Niedenthal, Paula
2016-08-01
Looking at another person's facial expression of emotion can trigger the same neural processes involved in producing the expression, and such responses play a functional role in emotion recognition. Disrupting individuals' facial action, for example, interferes with verbal emotion recognition tasks. We tested the hypothesis that facial responses also play a functional role in the perceptual processing of emotional expressions. We altered the facial action of participants with a gel facemask while they performed a task that involved distinguishing target expressions from highly similar distractors. Relative to control participants, participants in the facemask condition demonstrated inferior perceptual discrimination of facial expressions, but not of nonface stimuli. The findings suggest that somatosensory/motor processes involving the face contribute to the visual perceptual (and not just conceptual) processing of facial expressions. More broadly, our study contributes to growing evidence for the fundamentally interactive nature of the perceptual inputs from different sensory modalities.
Cultural differences in gaze and emotion recognition: Americans contrast more than Chinese.
Stanley, Jennifer Tehan; Zhang, Xin; Fung, Helene H; Isaacowitz, Derek M
2013-02-01
We investigated the influence of contextual expressions on emotion recognition accuracy and gaze patterns among American and Chinese participants. We expected Chinese participants would be more influenced by, and attend more to, contextual information than Americans. Consistent with our hypothesis, Americans were more accurate than Chinese participants at recognizing emotions embedded in the context of other emotional expressions. Eye-tracking data suggest that, for some emotions, Americans attended more to the target faces, and they made more gaze transitions to the target face than Chinese. For all emotions except anger and disgust, Americans appeared to use more of a contrasting strategy where each face was individually contrasted with the target face, compared with Chinese who used less of a contrasting strategy. Both cultures were influenced by contextual information, although the benefit of contextual information depended upon the perceptual dissimilarity of the contextual emotions to the target emotion and the gaze pattern employed during the recognition task. PsycINFO Database Record (c) 2013 APA, all rights reserved.
Cultural Differences in Gaze and Emotion Recognition: Americans Contrast More than Chinese
Tehan Stanley, Jennifer; Zhang, Xin; Fung, Helene H.; Isaacowitz, Derek M.
2014-01-01
We investigated the influence of contextual expressions on emotion recognition accuracy and gaze patterns among American and Chinese participants. We expected Chinese participants would be more influenced by, and attend more to, contextual information than Americans. Consistent with our hypothesis, Americans were more accurate than Chinese participants at recognizing emotions embedded in the context of other emotional expressions. Eye tracking data suggest that, for some emotions, Americans attended more to the target faces and made more gaze transitions to the target face than Chinese. For all emotions except anger and disgust, Americans appeared to use more of a contrasting strategy where each face was individually contrasted with the target face, compared with Chinese who used less of a contrasting strategy. Both cultures were influenced by contextual information, although the benefit of contextual information depended upon the perceptual dissimilarity of the contextual emotions to the target emotion and the gaze pattern employed during the recognition task. PMID:22889414
Lau, Anna S; Fung, Joey; Wang, Shu-Wen; Kang, Sun-Mee
2009-01-01
Previous research has documented elevated levels of social anxiety in Asian American college students when compared with their European American peers. The authors hypothesized that higher symptoms among Asians could be explained by cultural differences in attunement to the emotional states of others. Socialization within interdependent cultures may cultivate concerns about accurately perceiving others' emotional responses, yet at the same time, norms governing emotional control may limit competencies in emotion recognition. A sample of 264 Asian American and European American college students completed measures of social anxiety, attunement concerns (shame socialization and loss of face), and attunement competencies (self-reported sensitivity and performance on emotion recognition tasks). Results confirmed that ethnic differences in social anxiety symptoms were mediated by differences in attunement concerns and competencies in emotion recognition. Asian American college students may find themselves in a double bind that leads to social unease because of a cultural emphasis on sensitivity to others' emotions in the midst of barriers to developing this attunement skill set.
Stewart, Suzanne L K; Schepman, Astrid; Haigh, Matthew; McHugh, Rhian; Stewart, Andrew J
2018-03-14
The recognition of emotional facial expressions is often subject to contextual influence, particularly when the face and the context convey similar emotions. We investigated whether spontaneous, incidental affective theory of mind inferences made while reading vignettes describing social situations would produce context effects on the identification of same-valenced emotions (Experiment 1) as well as differently-valenced emotions (Experiment 2) conveyed by subsequently presented faces. Crucially, we found an effect of context on reaction times in both experiments while, in line with previous work, we found evidence for a context effect on accuracy only in Experiment 1. This demonstrates that affective theory of mind inferences made at the pragmatic level of a text can automatically, contextually influence the perceptual processing of emotional facial expressions in a separate task even when those emotions are of a distinctive valence. Thus, our novel findings suggest that language acts as a contextual influence to the recognition of emotional facial expressions for both same and different valences.
Ventura, Joseph; Wood, Rachel C.; Jimenez, Amy M.; Hellemann, Gerhard S.
2014-01-01
Background In schizophrenia patients, one of the most commonly studied deficits of social cognition is emotion processing (EP), which has documented links to facial recognition (FR). But, how are deficits in facial recognition linked to emotion processing deficits? Can neurocognitive and symptom correlates of FR and EP help differentiate the unique contribution of FR to the domain of social cognition? Methods A meta-analysis of 102 studies (combined n = 4826) in schizophrenia patients was conducted to determine the magnitude and pattern of relationships between facial recognition, emotion processing, neurocognition, and type of symptom. Results Meta-analytic results indicated that facial recognition and emotion processing are strongly interrelated (r = .51). In addition, the relationship between FR and EP through voice prosody (r = .58) is as strong as the relationship between FR and EP based on facial stimuli (r = .53). Further, the relationship between emotion recognition, neurocognition, and symptoms is independent of the emotion processing modality – facial stimuli and voice prosody. Discussion The association between FR and EP that occurs through voice prosody suggests that FR is a fundamental cognitive process. The observed links between FR and EP might be due to bottom-up associations between neurocognition and EP, and not simply because most emotion recognition tasks use visual facial stimuli. In addition, links with symptoms, especially negative symptoms and disorganization, suggest possible symptom mechanisms that contribute to FR and EP deficits. PMID:24268469
Ventura, Joseph; Wood, Rachel C; Jimenez, Amy M; Hellemann, Gerhard S
2013-12-01
In schizophrenia patients, one of the most commonly studied deficits of social cognition is emotion processing (EP), which has documented links to facial recognition (FR). But, how are deficits in facial recognition linked to emotion processing deficits? Can neurocognitive and symptom correlates of FR and EP help differentiate the unique contribution of FR to the domain of social cognition? A meta-analysis of 102 studies (combined n=4826) in schizophrenia patients was conducted to determine the magnitude and pattern of relationships between facial recognition, emotion processing, neurocognition, and type of symptom. Meta-analytic results indicated that facial recognition and emotion processing are strongly interrelated (r=.51). In addition, the relationship between FR and EP through voice prosody (r=.58) is as strong as the relationship between FR and EP based on facial stimuli (r=.53). Further, the relationship between emotion recognition, neurocognition, and symptoms is independent of the emotion processing modality - facial stimuli and voice prosody. The association between FR and EP that occurs through voice prosody suggests that FR is a fundamental cognitive process. The observed links between FR and EP might be due to bottom-up associations between neurocognition and EP, and not simply because most emotion recognition tasks use visual facial stimuli. In addition, links with symptoms, especially negative symptoms and disorganization, suggest possible symptom mechanisms that contribute to FR and EP deficits. © 2013 Elsevier B.V. All rights reserved.
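As a side note on the method, pooled correlations of the kind reported in this meta-analysis are typically obtained by Fisher-z transforming each study's r, averaging with inverse-variance weights, and back-transforming. Below is a minimal, hypothetical sketch of fixed-effect pooling with fabricated study values; it is not the 102-study dataset analysed above, and the actual analysis may have used a random-effects model.

```python
import numpy as np

def pooled_correlation(rs, ns):
    """Fixed-effect pooled r: Fisher-z transform, weight by n - 3, back-transform."""
    rs, ns = np.asarray(rs, dtype=float), np.asarray(ns, dtype=float)
    z = np.arctanh(rs)          # Fisher z transform of each study's correlation
    w = ns - 3.0                # inverse-variance weights for z
    z_bar = (w * z).sum() / w.sum()
    return np.tanh(z_bar)       # back-transform to the correlation scale

# Example with made-up studies (r values and sample sizes are illustrative only):
print(pooled_correlation([0.45, 0.55, 0.50], [60, 120, 80]))
```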
Does Facial Amimia Impact the Recognition of Facial Emotions? An EMG Study in Parkinson’s Disease
Argaud, Soizic; Delplanque, Sylvain; Houvenaghel, Jean-François; Auffret, Manon; Duprez, Joan; Vérin, Marc; Grandjean, Didier; Sauleau, Paul
2016-01-01
According to embodied simulation theory, understanding other people's emotions is fostered by facial mimicry. However, studies assessing the effect of facial mimicry on the recognition of emotion are still controversial. In Parkinson's disease (PD), one of the most distinctive clinical features is facial amimia, a reduction in facial expressiveness, but patients also show emotional disturbances. The present study used the pathological model of PD to examine the role of facial mimicry in emotion recognition by investigating EMG responses in PD patients during a facial emotion recognition task (anger, joy, neutral). Our results evidenced a significant decrease in facial mimicry for joy in PD, essentially linked to the absence of reaction of the zygomaticus major and the orbicularis oculi muscles in response to happy avatars, whereas facial mimicry for expressions of anger was relatively preserved. We also confirmed that PD patients were less accurate in recognizing positive and neutral facial expressions and highlighted a beneficial effect of facial mimicry on the recognition of emotion. We thus provide additional arguments for embodied simulation theory, suggesting that facial mimicry is a potential lever for therapeutic action in PD, even if it does not appear to be strictly required for emotion recognition as such. PMID:27467393
Neuroticism and facial emotion recognition in healthy adults.
Andric, Sanja; Maric, Nadja P; Knezevic, Goran; Mihaljevic, Marina; Mirjanic, Tijana; Velthorst, Eva; van Os, Jim
2016-04-01
The aim of the present study was to examine whether healthy individuals with higher levels of neuroticism, a robust independent predictor of psychopathology, exhibit altered facial emotion recognition performance. Facial emotion recognition accuracy was investigated in 104 healthy adults using the Degraded Facial Affect Recognition Task (DFAR). Participants' degree of neuroticism was estimated using neuroticism scales extracted from the Eysenck Personality Questionnaire and the Revised NEO Personality Inventory. A significant negative correlation between the degree of neuroticism and the percentage of correct answers on the DFAR was found only for happy facial expressions (significant after applying Bonferroni correction). Altered sensitivity to emotional context represents a useful and easy way to obtain a cognitive phenotype that correlates strongly with inter-individual variations in neuroticism linked to stress vulnerability and subsequent psychopathology. The present findings could have implications for early intervention strategies and staging models in psychiatry. © 2015 Wiley Publishing Asia Pty Ltd.
The association between PTSD and facial affect recognition.
Williams, Christian L; Milanak, Melissa E; Judah, Matt R; Berenbaum, Howard
2018-05-05
The major aim of this study was to examine how, if at all, higher levels of PTSD symptoms were associated with performance on a facial affect recognition task in which facial expressions of emotion are superimposed on emotionally valenced, non-face images. College students with trauma histories (N = 90) completed a facial affect recognition task as well as measures of exposure to traumatic events and PTSD symptoms. When the face and context matched, participants with higher levels of PTSD were significantly more accurate. When the face and context were mismatched, participants with lower levels of PTSD were more accurate than those with higher levels of PTSD. These findings suggest that PTSD is associated with how people process affective information. Furthermore, these results suggest that the enhanced attention of people with higher levels of PTSD to affective information can be either beneficial or detrimental to their ability to accurately identify facial expressions of emotion. Limitations, future directions and clinical implications are discussed. Copyright © 2018 Elsevier B.V. All rights reserved.
Rapid communication: Global-local processing affects recognition of distractor emotional faces.
Srinivasan, Narayanan; Gupta, Rashmi
2011-03-01
Recent studies have shown links between happy faces and global, distributed attention, as well as between sad faces and local, focused attention. Emotions have been shown to affect global-local processing. Given that studies on emotion-cognition interactions have not explored the effect of perceptual processing at different spatial scales on processing stimuli with emotional content, the present study investigated the link between perceptual focus and emotional processing. The study investigated the effects of global-local processing on the recognition of distractor faces with emotional expressions. Participants performed a digit discrimination task with digits at either the global level or the local level presented against a distractor face (happy or sad) as background. The results showed that global processing associated with a broad scope of attention facilitates recognition of happy faces, and local processing associated with a narrow scope of attention facilitates recognition of sad faces. The novel results of the study provide conclusive evidence for emotion-cognition interactions by demonstrating the effect of perceptual processing on emotional faces. The results, along with earlier complementary results on the effect of emotion on global-local processing, support a reciprocal relationship between emotional processing and global-local processing. Distractor processing with emotional information also has implications for theories of selective attention.
Underconnectivity of the superior temporal sulcus predicts emotion recognition deficits in autism
Woolley, Daniel G.; Steyaert, Jean; Di Martino, Adriana; Swinnen, Stephan P.; Wenderoth, Nicole
2014-01-01
Neurodevelopmental disconnections have been assumed to cause behavioral alterations in autism spectrum disorders (ASDs). Here, we combined measurements of intrinsic functional connectivity (iFC) from resting-state functional magnetic resonance imaging (fMRI) with task-based fMRI to explore whether altered activity and/or iFC of the right posterior superior temporal sulcus (pSTS) mediates deficits in emotion recognition in ASD. Fifteen adults with ASD and 15 matched controls underwent resting-state and task-based fMRI, during which participants discriminated emotional states from point light displays (PLDs). Intrinsic FC of the right pSTS was further examined using resting-state data from 584 individuals (278 ASD/306 controls) of the Autism Brain Imaging Data Exchange (ABIDE). Participants with ASD were less accurate than controls in recognizing emotional states from PLDs. Analyses revealed pronounced ASD-related reductions both in task-based activity and resting-state iFC of the right pSTS with fronto-parietal areas typically encompassing the action observation network (AON). Notably, pSTS hypo-activity was related to pSTS hypo-connectivity, and both measures were predictive of emotion recognition performance, with each measure explaining a unique part of the variance. Analyses with the large independent ABIDE dataset replicated reductions in pSTS-iFC to fronto-parietal regions. These findings provide novel evidence that pSTS hypo-activity and hypo-connectivity with the fronto-parietal AON are linked to the social deficits characteristic of ASD. PMID:24078018
Lysaker, Paul H; Hasson-Ohayon, Ilanit; Kravetz, Shlomo; Kent, Jerillyn S; Roe, David
2013-04-30
Many individuals with schizophrenia have been found to experience difficulties recognizing a range of their own mental states, including memories and emotions. While there is some evidence that the self-perception of empathy in schizophrenia is often at odds with objective observations, little is known about the correlates of rates of concordance between self and rater assessments of empathy for this group. To explore this issue we gathered self and rater assessments of empathy in addition to assessments of emotion recognition using the Bell Lysaker Emotion Recognition Task, insight using the Scale to Assess Unawareness of Mental Disorder, and symptoms using the Positive and Negative Syndrome Scale from 91 adults diagnosed with schizophrenia spectrum disorders. Results revealed that participants with better emotion recognition, better insight, fewer positive symptoms and fewer depressive symptoms produced self-ratings of empathy which were more strongly correlated with assessments of empathy performed by raters than participants with greater deficits in these domains. Results suggest that deficits in emotion recognition along with poor insight and higher levels of positive and depressive symptoms may affect the degree of agreement between self and rater assessments of empathy in schizophrenia. Published by Elsevier Ireland Ltd.
Buchanan, Tony W; Bibas, David; Adolphs, Ralph
2010-05-14
How do we recognize emotions from other people? One possibility is that our own emotional experiences guide us in the online recognition of emotion in others. A distinct but related possibility is that emotion experience helps us to learn how to recognize emotions in childhood. We explored these ideas in a large sample of people (N = 4,608) ranging from 5 to over 50 years old. Participants were asked to rate the intensity of emotional experience in their own lives, as well as to perform a task of facial emotion recognition. Those who reported more intense experience of fear and happiness were significantly more accurate (closer to prototypical) in recognizing facial expressions of fear and happiness, respectively, and intense experience of fear was associated also with more accurate recognition of surprised and happy facial expressions. The associations held across all age groups. These results suggest that the intensity of one's own emotional experience of fear and happiness correlates with the ability to recognize these emotions in others, and demonstrate such an association as early as age 5.
Effects of delta-9-tetrahydrocannabinol on evaluation of emotional images
Ballard, Michael E; Bedi, Gillinder; de Wit, Harriet
2013-01-01
There is growing evidence that drugs of abuse alter processing of emotional information in ways that could be attractive to users. Our recent report that Δ9-tetrahydrocannabinol (THC) diminishes amygdalar activation in response to threat-related faces suggests that THC may modify evaluation of emotionally-salient, particularly negative or threatening, stimuli. In this study, we examined the effects of acute THC on evaluation of emotional images. Healthy volunteers received two doses of THC (7.5 and 15 mg; p.o.) and placebo across separate sessions before performing tasks assessing facial emotion recognition and emotional responses to pictures of emotional scenes. THC significantly impaired recognition of facial fear and anger, but it only marginally impaired recognition of sadness and happiness. The drug did not consistently affect ratings of emotional scenes. THC's effects on emotional evaluation were not clearly related to its mood-altering effects. These results support our previous work, and show that THC reduces perception of facial threat. Nevertheless, THC does not appear to positively bias evaluation of emotional stimuli in general. PMID:22585232
Hindocha, Chandni; Freeman, Tom P; Schafer, Grainne; Gardener, Chelsea; Das, Ravi K; Morgan, Celia J A; Curran, H Valerie
2015-03-01
Acute administration of the primary psychoactive constituent of cannabis, Δ-9-tetrahydrocannabinol (THC), impairs human facial affect recognition, implicating the endocannabinoid system in emotional processing. Another main constituent of cannabis, cannabidiol (CBD), has seemingly opposite functional effects on the brain. This study aimed to determine the effects of THC and CBD, both alone and in combination, on emotional facial affect recognition. 48 volunteers, selected for high and low frequency of cannabis use and schizotypy, were administered THC (8 mg), CBD (16 mg), THC+CBD (8 mg+16 mg) and placebo, by inhalation, in a 4-way, double-blind, placebo-controlled crossover design. They completed an emotional facial affect recognition task including fearful, angry, happy, sad, surprise and disgust faces varying in intensity from 20% to 100%. A visual analogue scale (VAS) of feeling 'stoned' was also completed. In comparison to placebo, CBD improved emotional facial affect recognition at 60% emotional intensity; THC was detrimental to the recognition of ambiguous faces of 40% intensity. The combination of THC+CBD produced no impairment. Relative to placebo, both THC alone and combined THC+CBD equally increased feelings of being 'stoned'. CBD did not influence feelings of 'stoned'. No effects of frequency of use or schizotypy were found. In conclusion, CBD improves recognition of emotional facial affect and attenuates the impairment induced by THC. This is the first human study examining the effects of different cannabinoids on emotional processing. It provides preliminary evidence that different pharmacological agents acting upon the endocannabinoid system can both improve and impair recognition of emotional faces. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
Hindocha, Chandni; Freeman, Tom P.; Schafer, Grainne; Gardener, Chelsea; Das, Ravi K.; Morgan, Celia J.A.; Curran, H. Valerie
2015-01-01
Acute administration of the primary psychoactive constituent of cannabis, Δ-9-tetrahydrocannabinol (THC), impairs human facial affect recognition, implicating the endocannabinoid system in emotional processing. Another main constituent of cannabis, cannabidiol (CBD), has seemingly opposite functional effects on the brain. This study aimed to determine the effects of THC and CBD, both alone and in combination, on emotional facial affect recognition. Forty-eight volunteers, selected for high and low frequency of cannabis use and schizotypy, were administered THC (8 mg), CBD (16 mg), THC+CBD (8 mg+16 mg) and placebo, by inhalation, in a 4-way, double-blind, placebo-controlled crossover design. They completed an emotional facial affect recognition task including fearful, angry, happy, sad, surprise and disgust faces varying in intensity from 20% to 100%. A visual analogue scale (VAS) of feeling ‘stoned’ was also completed. In comparison to placebo, CBD improved emotional facial affect recognition at 60% emotional intensity; THC was detrimental to the recognition of ambiguous faces of 40% intensity. The combination of THC+CBD produced no impairment. Relative to placebo, both THC alone and combined THC+CBD equally increased feelings of being ‘stoned’. CBD did not influence feelings of being ‘stoned’. No effects of frequency of use or schizotypy were found. In conclusion, CBD improves recognition of emotional facial affect and attenuates the impairment induced by THC. This is the first human study examining the effects of different cannabinoids on emotional processing. It provides preliminary evidence that different pharmacological agents acting upon the endocannabinoid system can both improve and impair recognition of emotional faces. PMID:25534187
Anticipation of Negative Pictures Enhances the P2 and P3 in Their Later Recognition
Lin, Huiyan; Xiang, Jing; Li, Saili; Liang, Jiafeng; Jin, Hua
2015-01-01
Anticipation of emotional pictures has been found to be relevant to the encoding of the pictures as well as their later recognition performance. However, it is as yet unknown whether anticipation modulates neural activity in the later recognition of emotional pictures. To address this issue, participants in the present study were asked to view emotional (negative or neutral) pictures. The picture was preceded by a cue which indicated the emotional content of the picture in half of the trials (the anticipated condition) and without any cues in the other half (the unanticipated condition). Subsequently, participants had to perform an unexpected old/new recognition task in which old and novel pictures were presented without any preceding cues. Electroencephalography data was recorded during the recognition phase. Event-related potential results showed that for negative pictures, P2 and P3 amplitudes were larger in the anticipated as compared to the unanticipated condition; whereas this anticipation effect was not shown for neutral pictures. The present findings suggest that anticipation of negative pictures may enhance neural activity in their later recognition. PMID:26648860
Anticipation of Negative Pictures Enhances the P2 and P3 in Their Later Recognition.
Lin, Huiyan; Xiang, Jing; Li, Saili; Liang, Jiafeng; Jin, Hua
2015-01-01
Anticipation of emotional pictures has been found to be relevant to the encoding of the pictures as well as their later recognition performance. However, it is as yet unknown whether anticipation modulates neural activity in the later recognition of emotional pictures. To address this issue, participants in the present study were asked to view emotional (negative or neutral) pictures. The picture was preceded by a cue which indicated the emotional content of the picture in half of the trials (the anticipated condition) and without any cues in the other half (the unanticipated condition). Subsequently, participants had to perform an unexpected old/new recognition task in which old and novel pictures were presented without any preceding cues. Electroencephalography data was recorded during the recognition phase. Event-related potential results showed that for negative pictures, P2 and P3 amplitudes were larger in the anticipated as compared to the unanticipated condition; whereas this anticipation effect was not shown for neutral pictures. The present findings suggest that anticipation of negative pictures may enhance neural activity in their later recognition.
Emotional prosody processing in autism spectrum disorder
Kliemann, Dorit; Dziobek, Isabel; Heekeren, Hauke R.
2017-01-01
Individuals with Autism Spectrum Disorder (ASD) are characterized by severe deficits in social communication, although the nature of their impairments in emotional prosody processing has yet to be specified. Here, we investigated emotional prosody processing in individuals with ASD and controls with novel, lifelike behavioral and neuroimaging paradigms. Compared to controls, individuals with ASD showed reduced emotional prosody recognition accuracy on a behavioral task. On the neural level, individuals with ASD displayed reduced activity of the STS, insula and amygdala for complex vs basic emotions compared to controls. Moreover, the coupling between the STS and amygdala for complex vs basic emotions was reduced in the ASD group. Finally, groups differed with respect to the relationship between brain activity and behavioral performance. Brain activity during emotional prosody processing was more strongly related to prosody recognition accuracy in ASD participants. In contrast, the coupling between STS and anterior cingulate cortex (ACC) activity predicted behavioral task performance more strongly in the control group. These results provide evidence for aberrant emotional prosody processing in individuals with ASD. They suggest that the differences in the relationship between the neural and behavioral levels in individuals with ASD may account for their observed deficits in social communication. PMID:27531389
Recognition of Schematic Facial Displays of Emotion in Parents of Children with Autism
ERIC Educational Resources Information Center
Palermo, Mark T.; Pasqualetti, Patrizio; Barbati, Giulia; Intelligente, Fabio; Rossini, Paolo Maria
2006-01-01
Performance on an emotional labeling task in response to schematic facial patterns representing five basic emotions without the concurrent presentation of a verbal category was investigated in 40 parents of children with autism and 40 matched controls. "Autism fathers" performed worse than "autism mothers," who performed worse than controls in…
Perception and Lateralization of Spoken Emotion by Youths with High-Functioning Forms of Autism
ERIC Educational Resources Information Center
Baker, Kimberly F.; Montgomery, Allen A.; Abramson, Ruth
2010-01-01
The perception and the cerebral lateralization of spoken emotions were investigated in children and adolescents with high-functioning forms of autism (HFFA), and age-matched typically developing controls (TDC). A dichotic listening task using nonsense passages was used to investigate the recognition of four emotions: happiness, sadness, anger, and…
Mier, Daniela; Eisenacher, Sarah; Rausch, Franziska; Englisch, Susanne; Gerchen, Martin Fungisai; Zamoscik, Vera; Meyer-Lindenberg, Andreas; Zink, Mathias; Kirsch, Peter
2017-10-01
Schizophrenia is associated with significant impairments in social cognition. These impairments have been shown to go along with altered activation of the posterior superior temporal sulcus (pSTS). However, studies that investigate connectivity of pSTS during social cognition in schizophrenia are sparse. Twenty-two patients with schizophrenia and 22 matched healthy controls completed a social-cognitive task for functional magnetic resonance imaging that allows the investigation of affective Theory of Mind (ToM), emotion recognition and the processing of neutral facial expressions. Moreover, a resting-state measurement was taken. Patients with schizophrenia performed worse in the social-cognitive task (main effect of group). In addition, a group by social-cognitive processing interaction was revealed for activity, as well as for connectivity during the social-cognitive task, i.e., patients with schizophrenia showed hyperactivity of right pSTS during neutral face processing, but hypoactivity during emotion recognition and affective ToM. In addition, hypoconnectivity between right and left pSTS was revealed for affective ToM, but not for neutral face processing or emotion recognition. No group differences in connectivity from right to left pSTS occurred during resting state. This pattern of aberrant activity and connectivity of the right pSTS during social cognition might form the basis of false-positive perceptions of emotions and intentions and could contribute to the emergence and sustainment of delusions.
How Psychological Stress Affects Emotional Prosody.
Paulmann, Silke; Furnes, Desire; Bøkenes, Anne Ming; Cozzolino, Philip J
2016-01-01
We explored how experimentally induced psychological stress affects the production and recognition of vocal emotions. In Study 1a, we demonstrate that sentences spoken by stressed speakers are judged by naïve listeners as sounding more stressed than sentences uttered by non-stressed speakers. In Study 1b, negative emotions produced by stressed speakers are generally less well recognized than the same emotions produced by non-stressed speakers. Multiple mediation analyses suggest this poorer recognition of negative stimuli was due to a mismatch between the variation of volume voiced by speakers and the range of volume expected by listeners. Together, this suggests that the stress level of the speaker affects judgments made by the receiver. In Study 2, we demonstrate that participants who were induced with a feeling of stress before carrying out an emotional prosody recognition task performed worse than non-stressed participants. Overall, findings suggest detrimental effects of induced stress on interpersonal sensitivity.
How Psychological Stress Affects Emotional Prosody
Paulmann, Silke; Furnes, Desire; Bøkenes, Anne Ming; Cozzolino, Philip J.
2016-01-01
We explored how experimentally induced psychological stress affects the production and recognition of vocal emotions. In Study 1a, we demonstrate that sentences spoken by stressed speakers are judged by naïve listeners as sounding more stressed than sentences uttered by non-stressed speakers. In Study 1b, negative emotions produced by stressed speakers are generally less well recognized than the same emotions produced by non-stressed speakers. Multiple mediation analyses suggest this poorer recognition of negative stimuli was due to a mismatch between the variation of volume voiced by speakers and the range of volume expected by listeners. Together, this suggests that the stress level of the speaker affects judgments made by the receiver. In Study 2, we demonstrate that participants who were induced with a feeling of stress before carrying out an emotional prosody recognition task performed worse than non-stressed participants. Overall, findings suggest detrimental effects of induced stress on interpersonal sensitivity. PMID:27802287
Kreitewolf, Jens; Friederici, Angela D; von Kriegstein, Katharina
2014-11-15
Hemispheric specialization for linguistic prosody is a controversial issue. While it is commonly assumed that linguistic prosody and emotional prosody are preferentially processed in the right hemisphere, neuropsychological work directly comparing processes of linguistic prosody and emotional prosody suggests a predominant role of the left hemisphere for linguistic prosody processing. Here, we used two functional magnetic resonance imaging (fMRI) experiments to clarify the role of left and right hemispheres in the neural processing of linguistic prosody. In the first experiment, we sought to confirm previous findings showing that linguistic prosody processing compared to other speech-related processes predominantly involves the right hemisphere. Unlike previous studies, we controlled for stimulus influences by employing a prosody and speech task using the same speech material. The second experiment was designed to investigate whether a left-hemispheric involvement in linguistic prosody processing is specific to contrasts between linguistic prosody and emotional prosody or whether it also occurs when linguistic prosody is contrasted against other non-linguistic processes (i.e., speaker recognition). Prosody and speaker tasks were performed on the same stimulus material. In both experiments, linguistic prosody processing was associated with activity in temporal, frontal, parietal and cerebellar regions. Activation in temporo-frontal regions showed differential lateralization depending on whether the control task required recognition of speech or speaker: recognition of linguistic prosody predominantly involved right temporo-frontal areas when it was contrasted against speech recognition; when contrasted against speaker recognition, recognition of linguistic prosody predominantly involved left temporo-frontal areas. The results show that linguistic prosody processing involves functions of both hemispheres and suggest that recognition of linguistic prosody is based on an inter-hemispheric mechanism which exploits both a right-hemispheric sensitivity to pitch information and a left-hemispheric dominance in speech processing. Copyright © 2014 Elsevier Inc. All rights reserved.
Lax decision criteria lead to negativity bias: evidence from the emotional stroop task.
Liu, Guofang; Xin, Ziqiang; Lin, Chongde
2014-06-01
Negativity bias means that negative information is usually given more emphasis than comparable positive information. Under signal detection theory, recent research found that people more frequently and incorrectly identify negative task-related words as having been presented originally than positive words, even when they were not presented. That is, people have lax decision criteria for negative words. However, the response biases for task-unrelated negative words and for emotionally important words are still unclear. This study investigated response bias for these two kinds of words. Study 1 examined the response bias for task-unrelated negative words using an emotional Stroop task. Proportions of correct recognition to negative and positive words were assessed by non-parametric signal detection analysis. Participants have lower (i.e., more lax) decision criteria for task-unrelated negative words than for positive words. Study 2 supported and expanded this result by investigating participants' response bias for highly emotional words. Participants have lower decision criteria for highly emotional words than for less emotional words. Finally, possible evolutionary sources of the response bias were discussed.
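The non-parametric signal detection analysis used in this study works from hit and false-alarm rates. The sketch below is illustrative only (not the authors' code; the input proportions are hypothetical) and shows the standard A' sensitivity index and Donaldson's B''D criterion index, where a more negative B''D corresponds to a laxer (more liberal) decision criterion.

```python
# Minimal sketch of non-parametric signal detection indices (assumed inputs:
# per-participant hit and false-alarm rates for negative vs. positive words).

def a_prime(hit: float, fa: float) -> float:
    """Non-parametric sensitivity A' (0.5 = chance, 1.0 = perfect)."""
    if hit >= fa:
        return 0.5 + ((hit - fa) * (1 + hit - fa)) / (4 * hit * (1 - fa))
    return 0.5 - ((fa - hit) * (1 + fa - hit)) / (4 * fa * (1 - hit))

def b_double_prime_d(hit: float, fa: float) -> float:
    """Donaldson's B''D criterion; negative values indicate a lax (liberal) criterion."""
    return ((1 - hit) * (1 - fa) - hit * fa) / ((1 - hit) * (1 - fa) + hit * fa)

# Hypothetical rates: similar sensitivity, but a laxer criterion for negative words.
print(a_prime(0.80, 0.30), b_double_prime_d(0.80, 0.30))  # negative words
print(a_prime(0.75, 0.20), b_double_prime_d(0.75, 0.20))  # positive words
```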
Early effects of duloxetine on emotion recognition in healthy volunteers
Bamford, Susan; Penton-Voak, Ian; Pinkney, Verity; Baldwin, David S; Munafò, Marcus R; Garner, Matthew
2015-01-01
The serotonin-noradrenaline reuptake inhibitor (SNRI) duloxetine is an effective treatment for major depression and generalised anxiety disorder. Neuropsychological models of antidepressant drug action suggest therapeutic effects might be mediated by the early correction of maladaptive biases in emotion processing, including the recognition of emotional expressions. Sub-chronic administration of duloxetine (for two weeks) produces adaptive changes in neural circuitry implicated in emotion processing; however, its effects on emotional expression recognition are unknown. Forty healthy participants were randomised to receive either 14 days of duloxetine (60 mg/day, titrated from 30 mg after three days) or matched placebo (with sham titration) in a double-blind, between-groups, repeated-measures design. On day 0 and day 14 participants completed a computerised emotional expression recognition task that measured sensitivity to the six primary emotions. Thirty-eight participants (19 per group) completed their course of tablets and were included in the analysis. Results provide evidence that duloxetine, compared to placebo, may reduce the accurate recognition of sadness. Drug effects were driven by changes in participants’ ability to correctly detect subtle expressions of sadness, with greater change observed in the placebo relative to the duloxetine group. These effects occurred in the absence of changes in mood. Our preliminary findings require replication, but complement recent evidence that sadness recognition is a therapeutic target in major depression, and a mechanism through which SNRIs could resolve negative biases in emotion processing to achieve therapeutic effects. PMID:25759400
Galdos, Mariana; Simons, Claudia J P; Wichers, Marieke; Fernandez-Rivas, Aranzazu; Martinez-Azumendi, Oscar; Lataster, Tineke; Amer, Guillermo; Myin-Germeys, Inez; Gonzalez-Torres, Miguel Angel; van Os, Jim
2011-10-01
Neurocognitive impairments observed in psychotic disorder may impact on emotion recognition and theory of mind, resulting in altered understanding of the social world. Early intervention efforts would be served by further elucidation of this mechanism. Patients with a psychotic disorder (n=30) and a reference control group (n=310) were asked to offer emotional appraisals of images of social situations (EASS task). The degree to which case-control differences in appraisals were mediated by neurocognitive alterations was analyzed. The EASS task displayed convergent and discriminant validity. Compared to controls, patients displayed blunted emotional appraisal of social situations (B=0.52, 95% CI: 0.30, 0.74, P<0.001; adjusted for age, sex and number of years of education: B=0.44, 95% CI: 0.20, 0.68, P<0.001), a difference of 0.88 (adjusted: 0.75) standard deviation. After adjustment for neurocognitive variables, the case-control difference was reduced by nearly 75% and was non-significant (B=0.12, 95% CI: -0.14, 0.39, P=0.37). Neurocognitive impairments observed in patients with psychotic disorder may underlie misrepresentation of the social world, mediated by altered emotion recognition. A task assessing the social impact of cognitive alterations in clinical practice may be useful in detecting key alterations very early in the course of psychotic illness.
The effects of glucose dose and dual-task performance on memory for emotional material.
Brandt, Karen R; Sünram-Lea, Sandra I; Jenkinson, Paul M; Jones, Emma
2010-07-29
Whilst previous research has shown that glucose administration can boost memory performance, research investigating the effects of glucose on memory for emotional material has produced mixed findings. Whereas some research has shown that glucose impairs memory for emotional material, other research has shown that glucose has no effect on emotional items. The aim of the present research was therefore to provide further investigation of the role of glucose on the recognition of words with emotional valence by exploring effects of dose and dual-task performance, both of which affect glucose facilitation effects. The results replicated past research in showing that glucose administration, regardless of dose or dual-task conditions, did not affect the memorial advantage enjoyed by emotional material. This therefore suggests an independent relationship between blood glucose levels and memory for emotional material. Copyright 2010 Elsevier B.V. All rights reserved.
Social Cognition Psychometric Evaluation: Results of the Initial Psychometric Study
Pinkham, Amy E.; Penn, David L.; Green, Michael F.; Harvey, Philip D.
2016-01-01
Measurement of social cognition in treatment trials remains problematic due to poor and limited psychometric data for many tasks. As part of the Social Cognition Psychometric Evaluation (SCOPE) study, the psychometric properties of 8 tasks were assessed. One hundred and seventy-nine stable outpatients with schizophrenia and 104 healthy controls completed the battery at baseline and again after a 2–4-week retest interval at 2 sites. Tasks included the Ambiguous Intentions Hostility Questionnaire (AIHQ), Bell Lysaker Emotion Recognition Task (BLERT), Penn Emotion Recognition Task (ER-40), Relationships Across Domains (RAD), Reading the Mind in the Eyes Task (Eyes), The Awareness of Social Inferences Test (TASIT), Hinting Task, and Trustworthiness Task. Tasks were evaluated on: (i) test-retest reliability, (ii) utility as a repeated measure, (iii) relationship to functional outcome, (iv) practicality and tolerability, (v) sensitivity to group differences, and (vi) internal consistency. The BLERT and Hinting task showed the strongest psychometric properties across all evaluation criteria and are recommended for use in clinical trials. The ER-40, Eyes Task, and TASIT showed somewhat weaker psychometric properties and require further study. The AIHQ, RAD, and Trustworthiness Task showed poorer psychometric properties that suggest caution for their use in clinical trials. PMID:25943125
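Two of the evaluation criteria listed here, test–retest reliability and internal consistency, reduce to simple computations over session totals and item-level scores. The following sketch is illustrative only (simulated data, hypothetical 40-item scoring), not the SCOPE analysis code.

```python
import numpy as np

def test_retest_r(baseline: np.ndarray, retest: np.ndarray) -> float:
    """Pearson correlation between baseline and retest total scores."""
    return float(np.corrcoef(baseline, retest)[0, 1])

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal consistency for an (n_participants, n_items) item-score matrix."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Simulated stand-in data: 100 participants, 40 binary items (e.g. ER-40-style scoring).
rng = np.random.default_rng(0)
items = rng.integers(0, 2, size=(100, 40)).astype(float)
baseline = items.sum(axis=1)
retest = baseline + rng.normal(0, 2, size=100)  # retest ~2-4 weeks later
print(test_retest_r(baseline, retest), cronbach_alpha(items))
```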
Effects of exposure to facial expression variation in face learning and recognition.
Liu, Chang Hong; Chen, Wenfeng; Ward, James
2015-11-01
Facial expression is a major source of image variation in face images. Linking numerous expressions to the same face can be a huge challenge for face learning and recognition. It remains largely unknown what level of exposure to this image variation is critical for expression-invariant face recognition. We examined this issue in a recognition memory task, where the number of facial expressions of each face being exposed during a training session was manipulated. Faces were either trained with multiple expressions or a single expression, and they were later tested in either the same or different expressions. We found that recognition performance after learning three emotional expressions had no improvement over learning a single emotional expression (Experiments 1 and 2). However, learning three emotional expressions improved recognition compared to learning a single neutral expression (Experiment 3). These findings reveal both the limitation and the benefit of multiple exposures to variations of emotional expression in achieving expression-invariant face recognition. The transfer of expression training to a new type of expression is likely to depend on a relatively extensive level of training and a certain degree of variation across the types of expressions.
Intranasal oxytocin improves emotion recognition for youth with autism spectrum disorders.
Guastella, Adam J; Einfeld, Stewart L; Gray, Kylie M; Rinehart, Nicole J; Tonge, Bruce J; Lambert, Timothy J; Hickie, Ian B
2010-04-01
A diagnostic hallmark of autism spectrum disorders is a qualitative impairment in social communication and interaction. Deficits in the ability to recognize the emotions of others are believed to contribute to this. There is currently no effective treatment for these problems. In a double-blind, randomized, placebo-controlled, crossover design, we administered oxytocin nasal spray (18 or 24 IU) or a placebo to 16 male youth aged 12 to 19 who were diagnosed with Autistic or Asperger's Disorder. Participants then completed the Reading the Mind in the Eyes Task, a widely used and reliable test of emotion recognition. In comparison with placebo, oxytocin administration improved performance on the Reading the Mind in the Eyes Task. This effect was also shown when analysis was restricted to the younger participants aged 12 to 15 who received the lower dose. This study provides the first evidence that oxytocin nasal spray improves emotion recognition in young people diagnosed with autism spectrum disorders. Findings suggest the potential of earlier intervention and further evaluation of oxytocin nasal spray as a treatment to improve social communication and interaction in young people with autism spectrum disorders. Copyright 2010 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
Younger and Older Users’ Recognition of Virtual Agent Facial Expressions
Beer, Jenay M.; Smarr, Cory-Ann; Fisk, Arthur D.; Rogers, Wendy A.
2015-01-01
As technology advances, robots and virtual agents will be introduced into the home and healthcare settings to assist individuals, both young and old, with everyday living tasks. Understanding how users recognize an agent’s social cues is therefore imperative, especially in social interactions. Facial expression, in particular, is one of the most common non-verbal cues used to display and communicate emotion in on-screen agents (Cassell, Sullivan, Prevost, & Churchill, 2000). Age is important to consider because age-related differences in emotion recognition of human facial expression have been supported (Ruffman et al., 2008), with older adults showing a deficit for recognition of negative facial expressions. Previous work has shown that younger adults can effectively recognize facial emotions displayed by agents (Bartneck & Reichenbach, 2005; Courgeon et al. 2009; 2011; Breazeal, 2003); however, little research has compared in-depth younger and older adults’ ability to label a virtual agent’s facial emotions, an important consideration because social agents will be required to interact with users of varying ages. If such age-related differences exist for recognition of virtual agent facial expressions, we aim to understand if those age-related differences are influenced by the intensity of the emotion, dynamic formation of emotion (i.e., a neutral expression developing into an expression of emotion through motion), or the type of virtual character differing by human-likeness. Study 1 investigated the relationship between age-related differences, the implication of dynamic formation of emotion, and the role of emotion intensity in emotion recognition of the facial expressions of a virtual agent (iCat). Study 2 examined age-related differences in recognition expressed by three types of virtual characters differing by human-likeness (non-humanoid iCat, synthetic human, and human). Study 2 also investigated the role of configural and featural processing as a possible explanation for age-related differences in emotion recognition. First, our findings show age-related differences in the recognition of emotions expressed by a virtual agent, with older adults showing lower recognition for the emotions of anger, disgust, fear, happiness, sadness, and neutral. These age-related differences might be explained by older adults having difficulty discriminating similarity in configural arrangement of facial features for certain emotions; for example, older adults often mislabeled the similar emotions of fear as surprise. Second, our results did not provide evidence for the dynamic formation improving emotion recognition; but, in general, the intensity of the emotion improved recognition. Lastly, we learned that emotion recognition, for older and younger adults, differed by character type, from best to worst: human, synthetic human, and then iCat. Our findings provide guidance for design, as well as the development of a framework of age-related differences in emotion recognition. PMID:25705105
Sellaro, Roberta; de Gelder, Beatrice; Finisguerra, Alessandra; Colzato, Lorenza S
2018-02-01
The polyvagal theory suggests that the vagus nerve is the key phylogenetic substrate enabling optimal social interactions, a crucial aspect of which is emotion recognition. A previous study showed that the vagus nerve plays a causal role in mediating people's ability to recognize emotions based on images of the eye region. The aim of this study is to verify whether the previously reported causal link between vagal activity and emotion recognition can be generalized to situations in which emotions must be inferred from images of whole faces and bodies. To this end, we employed transcutaneous vagus nerve stimulation (tVNS), a novel non-invasive brain stimulation technique that causes the vagus nerve to fire by the application of a mild electrical stimulation to the auricular branch of the vagus nerve, located in the anterior protuberance of the outer ear. In two separate sessions, participants received active or sham tVNS before and while performing two emotion recognition tasks, aimed at indexing their ability to recognize emotions from facial and bodily expressions. Active tVNS, compared to sham stimulation, enhanced emotion recognition for whole faces but not for bodies. Our results confirm and further extend recent observations supporting a causal relationship between vagus nerve activity and the ability to infer others' emotional state, but restrict this association to situations in which the emotional state is conveyed by the whole face and/or by salient facial cues, such as eyes. Copyright © 2017 Elsevier Ltd. All rights reserved.
Bernaerts, Sylvie; Berra, Emmely; Wenderoth, Nicole; Alaerts, Kaat
2016-10-01
The neuropeptide 'oxytocin' (OT) is known to play a pivotal role in a variety of complex social behaviors by promoting a prosocial attitude and interpersonal bonding. One mechanism by which OT is hypothesized to promote prosocial behavior is by enhancing the processing of socially relevant information from the environment. With the present study, we explored to what extent OT can alter the 'reading' of emotional body language as presented by impoverished biological motion point light displays (PLDs). To do so, a double-blind between-subjects randomized placebo-controlled trial was conducted, assessing performance on a bodily emotion recognition task in healthy adult males before and after a single dose of intranasal OT (24 IU). Overall, a single dose of OT had a significant, medium-sized effect on emotion recognition from body language. OT-induced improvements in emotion recognition were not differentially modulated by the emotional valence of the presented stimuli (positive versus negative), and the overall tendency to label an observed emotional state as 'happy' (positive) or 'angry' (negative) was likewise not modified by the administration of OT. Albeit moderate, the present findings of OT-induced improvements in bodily emotion recognition from whole-body PLDs provide further support for a link between OT and the processing of socio-communicative cues originating from the body of others. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Processing of emotional reactivity and emotional memory over sleep
Baran, Bengi; Pace-Schott, Edward F.; Ericson, Callie; Spencer, Rebecca M. C.
2012-01-01
Sleep enhances memories, particularly emotional memories. As such, it has been suggested that sleep deprivation may reduce post-traumatic stress disorder. This presumes that emotional memory consolidation is paralleled by a reduction in emotional reactivity, an association that has not yet been examined. In the present experiment, we utilized an incidental memory task in humans and obtained valence and arousal ratings during two sessions separated either by 12 hours of daytime wake or 12 hours including overnight sleep. Recognition accuracy was greater following sleep relative to wake for both negative and neutral pictures. While emotional reactivity to negative pictures was greatly reduced over wake, the negative emotional response was relatively preserved over sleep. Moreover, protection of emotional reactivity was associated with greater time in REM sleep. Recognition accuracy, however, was not associated with REM. Thus, we provide the first evidence that sleep enhances emotional memory while preserving emotional reactivity. PMID:22262901
Feasibility Testing of a Wearable Behavioral Aid for Social Learning in Children with Autism.
Daniels, Jena; Haber, Nick; Voss, Catalin; Schwartz, Jessey; Tamura, Serena; Fazel, Azar; Kline, Aaron; Washington, Peter; Phillips, Jennifer; Winograd, Terry; Feinstein, Carl; Wall, Dennis P
2018-01-01
Recent advances in computer vision and wearable technology have created an opportunity to introduce mobile therapy systems for autism spectrum disorders (ASD) that can respond to the increasing demand for therapeutic interventions; however, feasibility questions must be answered first. We studied the feasibility of a prototype therapeutic tool for children with ASD using Google Glass, examining whether children with ASD would wear such a device, whether providing the emotion classification would improve emotion recognition, and how emotion recognition differs between ASD participants and neurotypical controls (NC). We ran a controlled laboratory experiment with 43 children: 23 with ASD and 20 NC. Children identified static facial images on a computer screen with one of 7 emotions in 3 successive batches: the first with no information about emotion provided to the child, the second with the correct classification from the Glass labeling the emotion, and the third again without emotion information. We then trained a logistic regression classifier on the emotion confusion matrices generated by the two information-free batches to predict ASD versus NC. All 43 children were comfortable wearing the Glass. ASD and NC participants who completed the computer task with Glass providing audible emotion labeling (n = 33) showed increased accuracies in emotion labeling, and the logistic regression classifier achieved an accuracy of 72.7%. Further analysis suggests that the ability to recognize surprise, fear, and neutrality may distinguish ASD cases from NC. This feasibility study supports the utility of a wearable device for social affective learning in ASD children and demonstrates subtle differences in how ASD and NC children perform on an emotion recognition task. Schattauer GmbH Stuttgart.
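The classification step described above, a logistic regression trained on each child's emotion confusion pattern to separate ASD from NC, could look roughly like the sketch below; the flattened 7×7 confusion-matrix features and the leave-one-out evaluation are assumptions for illustration, not details taken from the paper, and the data are simulated.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Simulated stand-in: one flattened 7x7 confusion matrix per child.
rng = np.random.default_rng(42)
n_children, n_emotions = 43, 7
X = rng.dirichlet(np.ones(n_emotions), size=(n_children, n_emotions)).reshape(n_children, -1)
y = np.array([1] * 23 + [0] * 20)  # 1 = ASD, 0 = neurotypical control

clf = LogisticRegression(max_iter=1000)
acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
print(f"leave-one-out accuracy: {acc:.3f}")  # the paper reports 72.7% on the real data
```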
ERIC Educational Resources Information Center
Li, Ming
2013-01-01
The goal of this work is to enhance the robustness and efficiency of the multimodal human states recognition task. Human states recognition can be considered as a joint term for identifying/verifing various kinds of human related states, such as biometric identity, language spoken, age, gender, emotion, intoxication level, physical activity, vocal…
Oldehinkel, Albertine J; Hartman, Catharina A; Van Oort, Floor V A; Nederhof, Esther
2015-01-01
Background Some adolescents function poorly in apparently benign environments, while others thrive despite hassles and difficulties. The aim of this study was to examine if adolescents with specialized skills in the recognition of either positive or negative emotions have a context-dependent risk of developing an anxiety or depressive disorder during adolescence, depending on exposure to positive or harsh parenting. Methods Data came from a large prospective Dutch population study (N = 1539). At age 11, perceived parental rejection and emotional warmth were measured by questionnaire, and emotion recognition skills by means of a reaction-time task. Lifetime diagnoses of anxiety and depressive disorders were assessed at about age 19, using a standardized diagnostic interview. Results Adolescents who were specialized in the recognition of positive emotions had a relatively high probability to develop an anxiety disorder when exposed to parental rejection (B(specialization × rejection) = 0.23, P < 0.01) and a relatively low probability in response to parental emotional warmth (B(specialization × warmth) = −0.24, P = 0.01), while the opposite pattern was found for specialists in negative emotions. The effect of parental emotional warmth on depression onset was likewise modified by emotion recognition specialization (B = −0.13, P = 0.03), but the effect of parental rejection was not (B = 0.02, P = 0.72). In general, the relative advantage of specialists in negative emotions was restricted to fairly uncommon negative conditions. Conclusions Our results suggest that there is no unequivocal relation between parenting behaviors and the probability to develop an anxiety or depressive disorder in adolescence, and that emotion recognition specialization may be a promising way to distinguish between various types of context-dependent reaction patterns. PMID:25642389
Oldehinkel, Albertine J; Hartman, Catharina A; Van Oort, Floor V A; Nederhof, Esther
2015-02-01
Some adolescents function poorly in apparently benign environments, while others thrive despite hassles and difficulties. The aim of this study was to examine if adolescents with specialized skills in the recognition of either positive or negative emotions have a context-dependent risk of developing an anxiety or depressive disorder during adolescence, depending on exposure to positive or harsh parenting. Data came from a large prospective Dutch population study (N = 1539). At age 11, perceived parental rejection and emotional warmth were measured by questionnaire, and emotion recognition skills by means of a reaction-time task. Lifetime diagnoses of anxiety and depressive disorders were assessed at about age 19, using a standardized diagnostic interview. Adolescents who were specialized in the recognition of positive emotions had a relatively high probability to develop an anxiety disorder when exposed to parental rejection (B(specialization × rejection) = 0.23, P < 0.01) and a relatively low probability in response to parental emotional warmth (B(specialization × warmth) = -0.24, P = 0.01), while the opposite pattern was found for specialists in negative emotions. The effect of parental emotional warmth on depression onset was likewise modified by emotion recognition specialization (B = -0.13, P = 0.03), but the effect of parental rejection was not (B = 0.02, P = 0.72). In general, the relative advantage of specialists in negative emotions was restricted to fairly uncommon negative conditions. Our results suggest that there is no unequivocal relation between parenting behaviors and the probability to develop an anxiety or depressive disorder in adolescence, and that emotion recognition specialization may be a promising way to distinguish between various types of context-dependent reaction patterns.
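The moderation findings reported here (e.g., B for specialization × rejection) correspond to interaction terms in a logistic regression of disorder onset on emotion-recognition specialization and parenting. A minimal sketch with a statsmodels formula follows; the column names and simulated data are hypothetical stand-ins, not the study's variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical stand-in data for a specialization-by-parenting interaction model.
rng = np.random.default_rng(1)
n = 1539
df = pd.DataFrame({
    "specialization": rng.normal(size=n),  # positive- vs. negative-emotion specialization
    "rejection": rng.normal(size=n),       # perceived parental rejection at age 11
    "warmth": rng.normal(size=n),          # perceived parental emotional warmth
})
logit_p = -1.5 + 0.23 * df.specialization * df.rejection - 0.24 * df.specialization * df.warmth
df["anxiety_dx"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("anxiety_dx ~ specialization * rejection + specialization * warmth",
                  data=df).fit(disp=0)
print(model.params)  # the interaction coefficients play the role of the reported B terms
```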
Nonlinguistic vocalizations from online amateur videos for emotion research: A validated corpus.
Anikin, Andrey; Persson, Tomas
2017-04-01
This study introduces a corpus of 260 naturalistic human nonlinguistic vocalizations representing nine emotions: amusement, anger, disgust, effort, fear, joy, pain, pleasure, and sadness. The recognition accuracy in a rating task varied greatly per emotion, from <40% for joy and pain, to >70% for amusement, pleasure, fear, and sadness. In contrast, the raters' linguistic-cultural group had no effect on recognition accuracy: The predominantly English-language corpus was classified with similar accuracies by participants from Brazil, Russia, Sweden, and the UK/USA. Supervised random forest models classified the sounds as accurately as the human raters. The best acoustic predictors of emotion were pitch, harmonicity, and the spacing and regularity of syllables. This corpus of ecologically valid emotional vocalizations can be filtered to include only sounds with high recognition rates, in order to study reactions to emotional stimuli of known perceptual types (reception side), or can be used in its entirety to study the association between affective states and vocal expressions (production side).
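The supervised classification reported above (random forests predicting the expressed emotion from acoustic features such as pitch, harmonicity and syllable spacing/regularity) could be sketched as follows; the feature matrix and labels are simulated placeholders rather than the corpus' actual measurements.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Simulated stand-in for the 260-sound corpus: a few acoustic predictors per sound.
rng = np.random.default_rng(7)
feature_names = ["pitch_median", "harmonicity", "syllable_spacing", "syllable_regularity"]
X = rng.normal(size=(260, len(feature_names)))
y = rng.integers(0, 9, size=260)  # 9 emotions: amusement, anger, ..., sadness

forest = RandomForestClassifier(n_estimators=500, random_state=0)
print("cross-validated accuracy:", cross_val_score(forest, X, y, cv=5).mean())

forest.fit(X, y)
for name, importance in zip(feature_names, forest.feature_importances_):
    print(name, round(importance, 3))  # importance ranking ~ 'best acoustic predictors'
```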
Kessels, Roy P C; Montagne, Barbara; Hendriks, Angelique W; Perrett, David I; de Haan, Edward H F
2014-03-01
The ability to recognize and label emotional facial expressions is an important aspect of social cognition. However, existing paradigms to examine this ability present only static facial expressions, suffer from ceiling effects or have limited or no norms. A computerized test, the Emotion Recognition Task (ERT), was developed to overcome these difficulties. In this study, we examined the effects of age, sex, and intellectual ability on emotion perception using the ERT. In this test, emotional facial expressions are presented as morphs gradually expressing one of the six basic emotions from neutral to four levels of intensity (40%, 60%, 80%, and 100%). The task was administered in 373 healthy participants aged 8-75. In children aged 8-17, only small developmental effects were found for the emotions anger and happiness, in contrast to adults who showed age-related decline on anger, fear, happiness, and sadness. Sex differences were present predominantly in the adult participants. IQ only minimally affected the perception of disgust in the children, while years of education were correlated with all emotions but surprise and disgust in the adult participants. A regression-based approach was adopted to present age- and education- or IQ-adjusted normative data for use in clinical practice. Previous studies using the ERT have demonstrated selective impairments on specific emotions in a variety of psychiatric, neurologic, or neurodegenerative patient groups, making the ERT a valuable addition to existing paradigms for the assessment of emotion perception. © 2013 The British Psychological Society.
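The regression-based approach to normative data mentioned above usually amounts to regressing raw scores on age and education (or IQ) in the healthy sample and expressing an individual's score as a standardized residual from the demographically expected score. A sketch of that idea, with simulated data and hypothetical variable names, follows.

```python
import numpy as np
import statsmodels.api as sm

# Simulated normative sample: emotion-recognition score as a function of age and education.
rng = np.random.default_rng(3)
n = 373
age = rng.uniform(8, 75, n)
education = rng.uniform(4, 20, n)
score = 70 - 0.15 * age + 0.8 * education + rng.normal(0, 5, n)

X = sm.add_constant(np.column_stack([age, education]))
norm_fit = sm.OLS(score, X).fit()

def adjusted_z(raw_score: float, person_age: float, person_education: float) -> float:
    """Deviation from the demographically expected score, in residual-SD units."""
    exog = sm.add_constant(np.array([[person_age, person_education]]), has_constant="add")
    expected = norm_fit.predict(exog)[0]
    return (raw_score - expected) / np.sqrt(norm_fit.scale)

print(adjusted_z(55.0, 70.0, 10.0))  # e.g. a 70-year-old with 10 years of education
```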
Impaired Perception of Emotional Expression in Amyotrophic Lateral Sclerosis.
Oh, Seong Il; Oh, Ki Wook; Kim, Hee Jin; Park, Jin Seok; Kim, Seung Hyun
2016-07-01
The increasing recognition that deficits in social emotions occur in amyotrophic lateral sclerosis (ALS) is helping to explain the spectrum of neuropsychological dysfunctions, thus supporting the view of ALS as a multisystem disorder involving neuropsychological deficits as well as motor deficits. The aim of this study was to characterize the emotion perception abilities of Korean patients with ALS based on the recognition of facial expressions. Twenty-four patients with ALS and 24 age- and sex-matched healthy controls completed neuropsychological tests and facial emotion recognition tasks [ChaeLee Korean Facial Expressions of Emotions (ChaeLee-E)]. The ChaeLee-E test includes facial expressions for seven emotions: happiness, sadness, anger, disgust, fear, surprise, and neutral. The ability to perceive facial emotions was significantly worse among ALS patients than among healthy controls [65.2±18.0% vs. 77.1±6.6% (mean±SD), p=0.009]. Eight of the 24 patients (33%) scored below the 5th percentile score of controls for recognizing facial emotions. Emotion perception deficits occur in Korean ALS patients, particularly regarding facial expressions of emotion. These findings expand the spectrum of cognitive and behavioral dysfunction associated with ALS into emotion processing dysfunction.
Decoding Actions and Emotions in Deaf Children: Evidence from a Biological Motion Task
ERIC Educational Resources Information Center
Ludlow, Amanda Katherine; Heaton, Pamela; Deruelle, Christine
2013-01-01
This study aimed to explore the recognition of emotional and non-emotional biological movements in children with severe and profound deafness. Twenty-four deaf children, together with 24 control children matched on mental age and 24 control children matched on chronological age, were asked to identify a person's actions, subjective states,…
ERIC Educational Resources Information Center
Tsang, Vicky
2018-01-01
The eye-tracking experiment was carried out to assess fixation duration and scan paths that individuals with and without high-functioning autism spectrum disorders employed when identifying simple and complex emotions. Participants viewed human photos of facial expressions and decided on the identification of emotion, the negative-positive emotion…
Actively Paranoid Patients with Schizophrenia Over Attribute Anger to Neutral Faces
Pinkham, Amy E.; Brensinger, Colleen; Kohler, Christian; Gur, Raquel E.; Gur, Ruben C.
2010-01-01
Previous investigations of the influence of paranoia on facial affect recognition in schizophrenia have been inconclusive as some studies demonstrate better performance for paranoid relative to non-paranoid patients and others show that paranoid patients display greater impairments. These studies have been limited by small sample sizes and inconsistencies in the criteria used to define groups. Here, we utilized an established emotion recognition task and a large sample to examine differential performance in emotion recognition ability between patients who were actively paranoid (AP) and those who were not actively paranoid (NAP). Accuracy and error patterns on the Penn Emotion Recognition test (ER40) were examined in 132 patients (64 NAP and 68 AP). Groups were defined based on the presence of paranoid ideation at the time of testing rather than diagnostic subtype. AP and NAP patients did not differ in overall task accuracy; however, an emotion by group interaction indicated that AP patients were significantly worse than NAP patients at correctly labeling neutral faces. A comparison of error patterns on neutral stimuli revealed that the groups differed only in misattributions of anger expressions, with AP patients being significantly more likely to misidentify a neutral expression as angry. The present findings suggest that paranoia is associated with a tendency to over attribute threat to ambiguous stimuli and also lend support to emerging hypotheses of amygdala hyperactivation as a potential neural mechanism for paranoid ideation. PMID:21112186
Positive and negative emotional contexts unevenly predict episodic memory.
Martínez-Galindo, Joyce Graciela; Cansino, Selene
2015-09-15
The aim of this study was to investigate whether the recognition of faces with neutral expressions differs when they are encoded under different emotional contexts (positive, negative or non-emotional). The effects of the emotional valence context on the subsequent memory effect (SME) and the autonomic responses were also examined. Twenty-eight participants performed a betting-game task in which the faces of their virtual opponents were presented in each trial. The probability of winning or losing was manipulated to generate positive or negative contexts, respectively. Additionally, the participants performed the same task without betting as a non-emotional condition. After the encoding phase, an old/new paradigm was performed for the faces of the virtual opponents. The recognition was superior for the faces encoded in the positive contexts than for the faces encoded in the non-emotional contexts. The skin conductance response amplitude was equivalent for both of the emotional contexts. The N170 and P300 components at occipital sites and the frontal slow wave manifested SMEs that were modulated by positive contexts; neither negative nor non-emotional contexts influenced these effects. The behavioral and neurophysiological data demonstrated that positive contexts are stronger predictors of episodic memory than negative or non-emotional contexts. Copyright © 2015 Elsevier B.V. All rights reserved.
Saccadic movement deficiencies in adults with ADHD tendencies.
Lee, Yun-Jeong; Lee, Sangil; Chang, Munseon; Kwak, Ho-Wan
2015-12-01
The goal of the present study was to explore deficits in gaze detection and emotional value judgment during a saccadic eye movement task in adults with attention deficit/hyperactivity disorder (ADHD) tendencies. Thirty-two participants, consisting of 16 adults with ADHD tendencies and 16 controls, were recruited from a pool of 243 university students. Among the many problems in adults with ADHD, our research focused on the deficits in the processing of nonverbal cues, such as gaze direction and the emotional value of others' faces. In Experiment 1, a cue display containing a face with emotional value and gaze direction was followed by a target display containing two faces located on the left and right side of the display. The participant's task was to make an anti-saccade opposite to the gaze direction if the cue face was not emotionally neutral. Participants with ADHD tendencies showed more overall errors than controls in making anti-saccades. Based on the hypothesis that the exposure duration of the cue display in Experiment 1 may have been too long, we presented the cue and target display simultaneously to prevent participants from preparing saccades in advance. Participants in Experiment 2 were asked to make either a pro-saccade or an anti-saccade depending on the emotional value of the central cue face. Interestingly, significant group differences were observed for errors of omission and commission. In addition, a significant three-way interaction among groups, cue emotion, and target gaze direction suggests that the emotion recognition and gaze control systems might somehow be interconnected. The result also shows that adults with ADHD tendencies are more easily distracted by a task-irrelevant gaze direction. Taken together, these results suggest that tasks requiring both response inhibition (anti-saccade) and gaze-emotion recognition might be useful in developing a diagnostic test for discriminating adults with ADHD tendencies from healthy adults.
Sharp, Carla; Vanwoerden, Salome; Van Baardewijk, Y; Tackett, J L; Stegge, H
2015-06-01
The aims of the current study were to show that the affective component of psychopathy (callous-unemotional traits) is related to deficits in recognizing emotions over and above other psychopathy dimensions and to show that this relationship is driven by a specific deficit in recognizing complex emotions more so than basic emotions. The authors administered the Child Eyes Test to assess emotion recognition in a community sample of preadolescent children between the ages of 10 and 12 (N = 417; 53.6% boys). The task required children to identify a broad array of emotions from photographic stimuli depicting the eye region of the face. Stimuli were then divided into complex or basic emotions. Results demonstrated a unique association between callous-unemotional traits and complex emotions, with weaker associations with basic emotion recognition, over and above other dimensions of psychopathy.
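The "over and above" claim corresponds to a hierarchical regression in which complex-emotion recognition is first predicted from the other psychopathy dimensions and callous-unemotional traits are added in a second step; the incremental R² is the quantity of interest. The sketch below uses simulated data and hypothetical predictor names, not the study's measures.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data for an incremental-validity (over-and-above) analysis.
rng = np.random.default_rng(11)
n = 417
df = pd.DataFrame({
    "impulsivity": rng.normal(size=n),
    "narcissism": rng.normal(size=n),
    "cu_traits": rng.normal(size=n),
})
df["complex_emotion_acc"] = 0.7 - 0.05 * df.cu_traits + rng.normal(0, 0.1, size=n)

step1 = smf.ols("complex_emotion_acc ~ impulsivity + narcissism", data=df).fit()
step2 = smf.ols("complex_emotion_acc ~ impulsivity + narcissism + cu_traits", data=df).fit()
print("R-squared change:", step2.rsquared - step1.rsquared)  # unique CU contribution
```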
Violent video game players and non-players differ on facial emotion recognition.
Diaz, Ruth L; Wong, Ulric; Hodgins, David C; Chiu, Carina G; Goghari, Vina M
2016-01-01
Violent video game playing has been associated with both positive and negative effects on cognition. We examined whether playing two or more hours of violent video games a day, compared to not playing video games, was associated with a different pattern of recognition of five facial emotions, while controlling for general perceptual and cognitive differences that might also occur. Undergraduate students were categorized as violent video game players (n = 83) or non-gamers (n = 69) and completed a facial recognition task, consisting of an emotion recognition condition and a control condition of gender recognition. Additionally, participants completed questionnaires assessing their video game and media consumption, aggression, and mood. Violent video game players recognized fearful faces both more accurately and quickly and disgusted faces less accurately than non-gamers. Desensitization to violence, constant exposure to fear and anxiety during game playing, and the habituation to unpleasant stimuli, are possible mechanisms that could explain these results. Future research should evaluate the effects of violent video game playing on emotion processing and social cognition more broadly. © 2015 Wiley Periodicals, Inc.
Effects of the potential lithium-mimetic, ebselen, on impulsivity and emotional processing.
Masaki, Charles; Sharpley, Ann L; Cooper, Charlotte M; Godlewska, Beata R; Singh, Nisha; Vasudevan, Sridhar R; Harmer, Catherine J; Churchill, Grant C; Sharp, Trevor; Rogers, Robert D; Cowen, Philip J
2016-07-01
Lithium remains the most effective treatment for bipolar disorder and also has important effects to lower suicidal behaviour, a property that may be linked to its ability to diminish impulsive, aggressive behaviour. The antioxidant drug, ebselen, has been proposed as a possible lithium-mimetic based on its ability in animals to inhibit inositol monophosphatase (IMPase), an action which it shares with lithium. The aim of the study was to determine whether treatment with ebselen altered emotional processing and diminished measures of risk-taking behaviour. We studied 20 healthy participants who were tested on two occasions receiving either ebselen (3600 mg over 24 h) or identical placebo in a double-blind, randomized, cross-over design. Three hours after the final dose of ebselen/placebo, participants completed the Cambridge Gambling Task (CGT) and a task that required the detection of emotional facial expressions (facial emotion recognition task (FERT)). On the CGT, relative to placebo, ebselen reduced delay aversion while on the FERT, it increased the recognition of positive vs negative facial expressions. The study suggests that at the dosage used, ebselen can decrease impulsivity and produce a positive bias in emotional processing. These findings have implications for the possible use of ebselen in the disorders characterized by impulsive behaviour and dysphoric mood.
Windmann, Sabine; Hill, Holger
2014-10-01
Performance on tasks requiring discrimination of at least two stimuli can be viewed either from an objective perspective (referring to actual stimulus differences) or from a subjective perspective (corresponding to participants' responses). Using event-related potentials recorded during an old/new recognition memory test involving emotionally laden and neutral words studied either blockwise or randomly intermixed, we show here how the objective perspective (old versus new items) yields late effects of blockwise emotional item presentation at parietal sites that the subjective perspective fails to find, whereas the subjective perspective ("old" versus "new" responses) is more sensitive to early effects of emotion at anterior sites than the objective perspective. Our results demonstrate the potential advantage of dissociating the subjective and the objective perspectives on task performance (in addition to analyzing trials with correct responses), especially for investigations of illusions and information processing biases, in behavioral and cognitive neuroscience studies. Copyright © 2014 Elsevier Inc. All rights reserved.
Recognition of facial and musical emotions in Parkinson's disease.
Saenz, A; Doé de Maindreville, A; Henry, A; de Labbey, S; Bakchine, S; Ehrlé, N
2013-03-01
Patients with amygdala lesions were found to be impaired in recognizing fear both from faces and from music. In patients with Parkinson's disease (PD), impairment in recognition of emotions from facial expressions was reported for disgust, fear, sadness and anger, but no studies had yet investigated this population for the recognition of emotions from both face and music. The ability to recognize basic universal emotions (fear, happiness and sadness) from both face and music was investigated in 24 medicated patients with PD and 24 healthy controls. The patient group was tested for language (verbal fluency tasks), memory (digit and spatial span), executive functions (Similarities and Picture Completion subtests of the WAIS III, Brixton and Stroop tests), and visual attention (Bells test), and completed self-assessment scales for anxiety and depression. Results showed that the PD group was significantly impaired for recognition of both fear and sadness emotions from facial expressions, whereas their performance in recognition of emotions from musical excerpts was not different from that of the control group. The scores of fear and sadness recognition from faces were correlated neither with scores in tests for executive and cognitive functions, nor with scores in self-assessment scales. We attributed the observed dissociation to the modality (visual vs. auditory) of presentation and to the ecological value of the musical stimuli that we used. We discuss the relevance of our findings for the care of patients with PD. © 2012 The Author(s) European Journal of Neurology © 2012 EFNS.
Li, Shijia; Weerda, Riklef; Milde, Christopher; Wolf, Oliver T; Thiel, Christiane M
2014-12-01
Previous studies have shown that acute psychosocial stress impairs recognition of declarative memory and that emotional material is especially sensitive to this effect. Animal studies suggest a central role of the amygdala which modulates memory processes in hippocampus, prefrontal cortex and other brain areas. We used functional magnetic resonance imaging (fMRI) to investigate neural correlates of stress-induced modulation of emotional recognition memory in humans. Twenty-seven healthy, right-handed, non-smoker male volunteers performed an emotional face recognition task. During encoding, participants were presented with 50 fearful and 50 neutral faces. One hour later, they underwent either a stress (Trier Social Stress Test) or a control procedure outside the scanner which was followed immediately by the recognition session inside the scanner, where participants had to discriminate between 100 old and 50 new faces. Stress increased salivary cortisol, blood pressure and pulse, and decreased the mood of participants but did not impact recognition memory. BOLD data during recognition revealed a stress condition by emotion interaction in the left inferior frontal gyrus and right hippocampus which was due to a stress-induced increase of neural activity to fearful and a decrease to neutral faces. Functional connectivity analyses revealed a stress-induced increase in coupling between the right amygdala and the right fusiform gyrus, when processing fearful as compared to neutral faces. Our results provide evidence that acute psychosocial stress affects medial temporal and frontal brain areas differentially for neutral and emotional items, with a stress-induced privileged processing of emotional stimuli.
Wang, Pengyun; Li, Juan; Li, Huijie; Li, Bing; Jiang, Yang; Bao, Feng; Zhang, Shouzi
2013-11-01
This study investigated whether the observed absence of emotional memory enhancement in recognition tasks in patients with amnestic mild cognitive impairment (aMCI) could be related to their greater proportion of familiarity-based responses for all stimuli, and whether recognition tests with emotional items had better discriminative power for aMCI patients than those with neutral items. In total, 31 aMCI patients and 30 healthy older adults participated in a recognition test followed by remember/know judgments. Positive, neutral, and negative faces were used as stimuli. For overall recognition performance, emotional memory enhancement was found only in healthy controls; they remembered more negative and positive stimuli than neutral ones. For "remember" responses, we found equivalent emotional memory enhancement in both groups, though a greater proportion of "remember" responses was observed in normal controls. For "know" responses, aMCI patients presented a larger proportion than normal controls did, and their "know" responses were not affected by emotion. A negative correlation was found between emotional enhancement effect and the memory performance related to "know" responses. In addition, receiver operating characteristic curve analysis revealed higher diagnostic accuracy for recognition test with emotional stimuli than with neutral stimuli. The present results implied that the absence of the emotional memory enhancement effect in aMCI patients might be related to their tendency to rely more on familiarity-based "know" responses for all stimuli. Furthermore, recognition memory tests using emotional stimuli may be better able than neutral stimuli to differentiate people with aMCI from cognitively normal older adults. PsycINFO Database Record (c) 2013 APA, all rights reserved.
Facial Affect Recognition in Violent and Nonviolent Antisocial Behavior Subtypes.
Schönenberg, Michael; Mayer, Sarah Verena; Christian, Sandra; Louis, Katharina; Jusyte, Aiste
2016-10-01
Prior studies provide evidence for impaired recognition of distress cues in individuals exhibiting antisocial behavior. However, it remains unclear whether this deficit is generally associated with antisociality or may be specific to violent behavior only. To examine whether there are meaningful differences between the two behavioral dimensions rule-breaking and aggression, violent and nonviolent incarcerated offenders as well as control participants were presented with an animated face recognition task in which a video sequence of a neutral face changed into an expression of one of the six basic emotions. The participants were instructed to press a button as soon as they were able to identify the emotional expression, allowing for an assessment of the perceived emotion onset. Both aggressive and nonaggressive offenders demonstrated a delayed perception of primarily fearful facial cues as compared to controls. These results suggest the importance of targeting impaired emotional processing in both types of antisocial behavior.
ERIC Educational Resources Information Center
Drus, Marina; Kozbelt, Aaron; Hughes, Robert R.
2014-01-01
To what extent do more creative people process emotional information differently than less creative people? This study examined the role of emotion processing in creativity and its implications for the creativity-psychopathology association. A total of 117 participants performed a memory recognition task for negative, positive, and neutral words;…
Leal, Stephanie L; Noche, Jessica A; Murray, Elizabeth A; Yassa, Michael A
2017-01-01
While aging is generally associated with episodic memory decline, not all older adults exhibit memory loss. Furthermore, emotional memories are not subject to the same extent of forgetting and appear preserved in aging. We conducted high-resolution fMRI during a task involving pattern separation of emotional information in older adults with and without age-related memory impairment (characterized by performance on a word-list learning task: low performers: LP vs. high performers: HP). We found signals consistent with emotional pattern separation in hippocampal dentate (DG)/CA3 in HP but not in LP individuals, suggesting a deficit in emotional pattern separation. During false recognition, we found increased DG/CA3 activity in LP individuals, suggesting that hyperactivity may be associated with overgeneralization. We additionally observed a selective deficit in basolateral amygdala-lateral entorhinal cortex-DG/CA3 functional connectivity in LP individuals during pattern separation of negative information. During negative false recognition, LP individuals showed increased medial temporal lobe functional connectivity, consistent with overgeneralization. Overall, these results suggest a novel mechanistic account of individual differences in emotional memory alterations exhibited in aging. Copyright © 2016 Elsevier Inc. All rights reserved.
Leal, Stephanie L.; Noche, Jessica A.; Murray, Elizabeth A.; Yassa, Michael A.
2018-01-01
While aging is generally associated with episodic memory decline, not all older adults exhibit memory loss. Furthermore, emotional memories are not subject to the same extent of forgetting and appear preserved in aging. We conducted high-resolution fMRI during a task involving pattern separation of emotional information in older adults with and without age-related memory impairment (characterized by performance on a word-list learning task: low performers: LP vs. high performers: HP). We found signals consistent with emotional pattern separation in hippocampal dentate (DG)/CA3 in HP but not in LP individuals, suggesting a deficit in emotional pattern separation. During false recognition, we found increased DG/CA3 activity in LP individuals, suggesting that hyperactivity may be associated with overgeneralization. We additionally observed a selective deficit in basolateral amygdala—lateral entorhinal cortex—DG/CA3 functional connectivity in LP individuals during pattern separation of negative information. During negative false recognition, LP individuals showed increased medial temporal lobe functional connectivity, consistent with overgeneralization. Overall, these results suggest a novel mechanistic account of individual differences in emotional memory alterations exhibited in aging. PMID:27723500
Kerstner, Tobias; Witthöft, Michael; Mier, Daniela; Diener, Carsten; Rist, Fred; Bailer, Josef
2015-06-01
To examine whether a 2-week attribution modification training (AMT) changes symptom severity, emotional evaluation of health-threatening stimuli, and cognitive biases in pathological health anxiety. We randomized 85 patients with pathological health anxiety into an electronic diary-based AMT group (AMTG; n = 42) and a control group without AMT (CG; n = 43). Self-report symptom measures, emotional evaluation, attentional bias, and memory bias toward symptom and illness words were assessed with an emotional Stroop task, a recognition task, and an emotional rating task for valence and arousal. After the 2-week period, the AMTG compared with the CG reported lower symptoms of pathological health anxiety, F(1, 82) = 10.94, p < .01, ηp² = .12, rated symptom, F(1, 82) = 5.56, p = .02, ηp² = .06, and illness words, F(1, 82) = 4.13, p = .045, ηp² = .05, as less arousing, and revealed a smaller memory response bias toward symptom words in the recognition task, F(1, 82) = 12.32, p < .01, ηp² = .13. However, no specific AMT effect was observed for the attentional bias. The results support the efficacy of a comparatively short cognitive intervention in pathological health anxiety as a possible add-on intervention to existing treatment approaches to reduce symptom severity, as well as abnormalities in health-related emotional evaluation and memory processes. (c) 2015 APA, all rights reserved.
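The partial eta squared values reported above can be recovered from each F statistic and its degrees of freedom via a standard identity; shown here as a worked check against the first reported contrast, not a recomputation from the original data:

\[
\eta_p^2 \;=\; \frac{F \cdot df_{\text{effect}}}{F \cdot df_{\text{effect}} + df_{\text{error}}}
\qquad\text{e.g.}\qquad
\frac{10.94 \times 1}{10.94 \times 1 + 82} \approx 0.118 \approx .12
\]

The same identity reproduces the other reported values (.06, .05 and .13).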
The processing of emotional prosody and semantics in schizophrenia: relationship to gender and IQ.
Scholten, M R M; Aleman, A; Kahn, R S
2008-06-01
Female patients with schizophrenia are less impaired in social life than male patients. Because social impairment in schizophrenia has been found to be associated with deficits in emotion recognition, we examined whether the female advantage in processing emotional prosody and semantics is preserved in schizophrenia. Forty-eight patients (25 males, 23 females) and 46 controls (23 males, 23 females) were assessed using an emotional language task (in which healthy women generally outperform healthy men), consisting of 96 sentences in four conditions: (1) neutral-content/emotional-tone (happy, sad, angry or anxious); (2) neutral-tone/emotional-content; (3) emotional-tone/incongruous emotional-content; and (4) emotional-content/incongruous emotional-tone. Participants had to ignore the emotional-content in the third condition and the emotional-tone in the fourth condition. In addition, participants were assessed with a visuospatial task (in which healthy men typically excel). Correlation coefficients were computed for associations between emotional language data, visuospatial data, IQ measures and patient variables. Overall, on the emotional language task, patients made more errors than control subjects, and women outperformed men across diagnostic groups. Controlling for IQ revealed a significant effect on task performance in all groups, especially in the incongruent tasks. On the rotation task, healthy men outperformed healthy women, but male patients, female patients and female controls obtained similar scores. The advantage in emotional prosodic and semantic processing in healthy women is preserved in schizophrenia, whereas the male advantage in visuospatial processing is lost. These findings may explain, in part, why social functioning is less compromised in women with schizophrenia than in men.
Familial covariation of facial emotion recognition and IQ in schizophrenia.
Andric, Sanja; Maric, Nadja P; Mihaljevic, Marina; Mirjanic, Tijana; van Os, Jim
2016-12-30
Alterations in general intellectual ability and social cognition in schizophrenia are core features of the disorder, evident at the onset of the illness and persistent throughout its course. However, previous studies examining cognitive alterations in siblings discordant for schizophrenia yielded inconsistent results. The present study aimed to investigate the nature of the association between facial emotion recognition and general IQ by applying a genetically sensitive cross-trait, cross-sibling design. Participants (total n=158; patients, unaffected siblings, controls) were assessed using the Benton Facial Recognition Test, the Degraded Facial Affect Recognition Task (DFAR) and the Wechsler Adult Intelligence Scale-III. Patients had lower IQ and altered facial emotion recognition in comparison with the other groups. Healthy siblings and controls did not significantly differ in IQ and DFAR performance, but siblings exhibited intermediate angry facial expression recognition. Cross-trait within-subject analyses showed significant associations between overall DFAR performance and IQ in all participants. Within-trait cross-sibling analyses found significant associations between patients' and siblings' IQ and overall DFAR performance, suggesting their familial clustering. Finally, cross-trait cross-sibling analyses revealed familial covariation of facial emotion recognition and IQ in siblings discordant for schizophrenia, further indicating their familial etiology. Both traits are important phenotypes for genetic studies and potential early clinical markers of schizophrenia-spectrum disorders. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
State-dependent alteration in face emotion recognition in depression.
Anderson, Ian M; Shippen, Clare; Juhasz, Gabriella; Chase, Diana; Thomas, Emma; Downey, Darragh; Toth, Zoltan G; Lloyd-Williams, Kathryn; Elliott, Rebecca; Deakin, J F William
2011-04-01
Negative biases in emotional processing are well recognised in people who are currently depressed but are less well described in those with a history of depression, where such biases may contribute to vulnerability to relapse. To compare accuracy, discrimination and bias in face emotion recognition in those with current and remitted depression. The sample comprised a control group (n = 101), a currently depressed group (n = 30) and a remitted depression group (n = 99). Participants provided valid data after receiving a computerised face emotion recognition task following standardised assessment of diagnosis and mood symptoms. In the control group women were more accurate in recognising emotions than men owing to greater discrimination. Among participants with depression, those in remission correctly identified more emotions than controls owing to increased response bias, whereas those currently depressed recognised fewer emotions owing to decreased discrimination. These effects were most marked for anger, fear and sadness but there was no significant emotion × group interaction, and a similar pattern tended to be seen for happiness although not for surprise or disgust. These differences were confined to participants who were antidepressant-free, with those taking antidepressants having similar results to the control group. Abnormalities in face emotion recognition differ between people with current depression and those in remission. Reduced discrimination in depressed participants may reflect withdrawal from the emotions of others, whereas the increased bias in those with a history of depression could contribute to vulnerability to relapse. The normal face emotion recognition seen in those taking medication may relate to the known effects of antidepressants on emotional processing and could contribute to their ability to protect against depressive relapse.
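The distinction drawn above between accuracy, discrimination and response bias can be made concrete with signal detection theory, one standard way of separating how well emotions are told apart from the tendency to endorse them; the sketch below is an illustration under that assumption, not necessarily the exact measures used in the study, and the hit/false-alarm rates are hypothetical.

```python
# Minimal sketch: discrimination (d') and response bias (criterion c) from
# hit and false-alarm rates, as in standard signal detection theory.
from scipy.stats import norm

def sdt_measures(hit_rate: float, false_alarm_rate: float):
    """Return (d_prime, criterion_c) for one emotion category."""
    z_h, z_fa = norm.ppf(hit_rate), norm.ppf(false_alarm_rate)
    d_prime = z_h - z_fa                # higher = better discrimination
    criterion = -0.5 * (z_h + z_fa)     # negative = liberal bias to label the emotion
    return d_prime, criterion

# Example: a moderately sensitive observer with a liberal labelling bias.
print(sdt_measures(0.80, 0.30))
```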
Mittermeier, Verena; Leicht, Gregor; Karch, Susanne; Hegerl, Ulrich; Möller, Hans-Jürgen; Pogarell, Oliver; Mulert, Christoph
2011-03-01
Several studies suggest that attention to emotional content is related to specific changes in central information processing. In particular, event-related potential (ERP) studies focusing on emotion recognition in pictures and faces or word processing have pointed toward a distinct component of the visual-evoked potential, the EPN ('early posterior negativity'), which has been shown to be related to attention to emotional content. In the present study, we were interested in the existence of a corresponding ERP component in the auditory modality and a possible relationship with the personality dimension extraversion-introversion, as assessed by the NEO Five-Factors Inventory. We investigated 29 healthy subjects using three types of auditory choice tasks: (1) the distinction of syllables with emotional intonation, (2) the identification of the emotional content of adjectives and (3) a purely cognitive control task. Compared with the cognitive control task, emotional paradigms using auditory stimuli evoked an EPN component with a distinct peak after 170 ms (EPN 170). Interestingly, subjects with high scores in the personality trait extraversion showed significantly higher EPN amplitudes for emotional paradigms (syllables and words) than introverted subjects.
End-to-End Multimodal Emotion Recognition Using Deep Neural Networks
NASA Astrophysics Data System (ADS)
Tzirakis, Panagiotis; Trigeorgis, George; Nicolaou, Mihalis A.; Schuller, Bjorn W.; Zafeiriou, Stefanos
2017-12-01
Automatic affect recognition is a challenging task due to the various modalities through which emotions can be expressed. Applications can be found in many domains, including multimedia retrieval and human-computer interaction. In recent years, deep neural networks have been used with great success in determining emotional states. Inspired by this success, we propose an emotion recognition system using auditory and visual modalities. To capture the emotional content across various styles of speaking, robust features need to be extracted. To this end, we utilize a Convolutional Neural Network (CNN) to extract features from the speech, while for the visual modality we use a deep residual network (ResNet) of 50 layers. In addition to the importance of feature extraction, a machine learning algorithm also needs to be insensitive to outliers while being able to model the context. To tackle this problem, Long Short-Term Memory (LSTM) networks are utilized. The system is then trained in an end-to-end fashion where, by also taking advantage of the correlations between the streams, we manage to significantly outperform the traditional approaches based on auditory and visual handcrafted features for the prediction of spontaneous and natural emotions on the RECOLA database of the AVEC 2016 research challenge on emotion recognition.
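A minimal PyTorch sketch of the kind of architecture described above: a 1-D CNN over raw speech, a ResNet-50 over video frames, and an LSTM that fuses the two streams over time to predict continuous emotion values. This is not the authors' code; layer sizes, input shapes and output dimensionality (e.g., arousal/valence) are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torchvision.models as models

class AudioVisualEmotionNet(nn.Module):
    def __init__(self, audio_feat=128, visual_feat=256, hidden=256, outputs=2):
        super().__init__()
        # Audio branch: small 1-D CNN over raw waveform chunks (one chunk per time step).
        self.audio_cnn = nn.Sequential(
            nn.Conv1d(1, 40, kernel_size=80, stride=4), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(40, audio_feat, kernel_size=5), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),          # -> (batch*steps, audio_feat, 1)
        )
        # Visual branch: ResNet-50 backbone with its classifier replaced.
        resnet = models.resnet50(weights=None)
        resnet.fc = nn.Linear(resnet.fc.in_features, visual_feat)
        self.visual_cnn = resnet
        # Temporal model over the concatenated per-step features.
        self.lstm = nn.LSTM(audio_feat + visual_feat, hidden,
                            num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, outputs)  # e.g., arousal and valence

    def forward(self, audio, frames):
        # audio:  (batch, steps, samples)   raw speech per time step
        # frames: (batch, steps, 3, H, W)   one RGB frame per time step
        b, t, s = audio.shape
        a = self.audio_cnn(audio.reshape(b * t, 1, s)).reshape(b, t, -1)
        v = self.visual_cnn(frames.reshape(b * t, *frames.shape[2:])).reshape(b, t, -1)
        x, _ = self.lstm(torch.cat([a, v], dim=-1))
        return self.head(x)                  # per-step predictions

# Example: 2 clips, 8 time steps, 6400 audio samples and one 112x112 frame per step.
model = AudioVisualEmotionNet()
preds = model(torch.randn(2, 8, 6400), torch.randn(2, 8, 3, 112, 112))
print(preds.shape)  # torch.Size([2, 8, 2])
```

Training such a model end to end (both branches and the LSTM optimized jointly against a regression loss on the emotion traces) is what distinguishes this approach from pipelines built on handcrafted audio-visual features.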
Second Language Ability and Emotional Prosody Perception
Bhatara, Anjali; Laukka, Petri; Boll-Avetisyan, Natalie; Granjon, Lionel; Anger Elfenbein, Hillary; Bänziger, Tanja
2016-01-01
The present study examines the effect of language experience on vocal emotion perception in a second language. Native speakers of French with varying levels of self-reported English ability were asked to identify emotions from vocal expressions produced by American actors in a forced-choice task, and to rate their pleasantness, power, alertness and intensity on continuous scales. Stimuli included emotionally expressive English speech (emotional prosody) and non-linguistic vocalizations (affect bursts), and a baseline condition with Swiss-French pseudo-speech. Results revealed effects of English ability on the recognition of emotions in English speech but not in non-linguistic vocalizations. Specifically, higher English ability was associated with less accurate identification of positive emotions, but not with the interpretation of negative emotions. Moreover, higher English ability was associated with lower ratings of pleasantness and power, again only for emotional prosody. This suggests that second language skills may sometimes interfere with emotion recognition from speech prosody, particularly for positive emotions. PMID:27253326
Seeing Life through Positive-Tinted Glasses: Color–Meaning Associations
Gil, Sandrine; Le Bigot, Ludovic
2014-01-01
There is a growing body of literature to show that color can convey information, owing to its emotionally meaningful associations. Most research so far has focused on negative hue–meaning associations (e.g., red) with the exception of the positive aspects associated with green. We therefore set out to investigate the positive associations of two colors (i.e., green and pink), using an emotional facial expression recognition task in which colors provided the emotional contextual information for the face processing. In two experiments, green and pink backgrounds enhanced happy face recognition and impaired sad face recognition, compared with a control color (gray). Our findings therefore suggest that because green and pink both convey positive information, they facilitate the processing of emotionally congruent facial expressions (i.e., faces expressing happiness) and interfere with that of incongruent facial expressions (i.e., faces expressing sadness). Data also revealed a positive association for white. Results are discussed within the theoretical framework of emotional cue processing and color meaning. PMID:25098167
Seeing life through positive-tinted glasses: color-meaning associations.
Gil, Sandrine; Le Bigot, Ludovic
2014-01-01
There is a growing body of literature to show that color can convey information, owing to its emotionally meaningful associations. Most research so far has focused on negative hue-meaning associations (e.g., red) with the exception of the positive aspects associated with green. We therefore set out to investigate the positive associations of two colors (i.e., green and pink), using an emotional facial expression recognition task in which colors provided the emotional contextual information for the face processing. In two experiments, green and pink backgrounds enhanced happy face recognition and impaired sad face recognition, compared with a control color (gray). Our findings therefore suggest that because green and pink both convey positive information, they facilitate the processing of emotionally congruent facial expressions (i.e., faces expressing happiness) and interfere with that of incongruent facial expressions (i.e., faces expressing sadness). Data also revealed a positive association for white. Results are discussed within the theoretical framework of emotional cue processing and color meaning.
Advanced Parkinson disease patients have impairment in prosody processing.
Albuquerque, Luisa; Martins, Maurício; Coelho, Miguel; Guedes, Leonor; Ferreira, Joaquim J; Rosa, Mário; Martins, Isabel Pavão
2016-01-01
The ability to recognize and interpret emotions in others is a crucial prerequisite of adequate social behavior. Impairments in emotion processing have been reported from the early stages of Parkinson's disease (PD). This study aims to characterize emotion recognition in advanced Parkinson's disease (APD) candidates for deep-brain stimulation and to compare emotion recognition abilities in visual and auditory domains. APD patients, defined as those with levodopa-induced motor complications (N = 42), and healthy controls (N = 43) matched by gender, age, and educational level, undertook the Comprehensive Affect Testing System (CATS), a battery that evaluates recognition of seven basic emotions (happiness, sadness, anger, fear, surprise, disgust, and neutral) on facial expressions and four emotions on prosody (happiness, sadness, anger, and fear). APD patients were assessed during the "ON" state. Group performance was compared with independent-samples t tests. Compared to controls, APD had significantly lower scores on the discrimination and naming of emotions in prosody, and visual discrimination of neutral faces, but no significant differences in visual emotional tasks. The contrasting performance in emotional processing between visual and auditory stimuli suggests that APD candidates for surgery have either a selective difficulty in recognizing emotions in prosody or a general defect in prosody processing. Studies investigating early-stage PD, and the effect of subcortical lesions in prosody processing, favor the latter interpretation. Further research is needed to understand these deficits in emotional prosody recognition and their possible contribution to later behavioral or neuropsychiatric manifestations of PD.
Tanaka, James W; Wolf, Julie M; Klaiman, Cheryl; Koenig, Kathleen; Cockburn, Jeffrey; Herlihy, Lauren; Brown, Carla; Stahl, Sherin S; South, Mikle; McPartland, James C; Kaiser, Martha D; Schultz, Robert T
2012-12-01
Although impaired social-emotional ability is a hallmark of autism spectrum disorder (ASD), the perceptual skills and mediating strategies contributing to the social deficits of autism are not well understood. A perceptual skill that is fundamental to effective social communication is the ability to accurately perceive and interpret facial emotions. To evaluate the expression processing of participants with ASD, we designed the Let's Face It! Emotion Skills Battery (LFI! Battery), a computer-based assessment composed of three subscales measuring verbal and perceptual skills implicated in the recognition of facial emotions. We administered the LFI! Battery to groups of participants with ASD and typically developing control (TDC) participants that were matched for age and IQ. On the Name Game labeling task, participants with ASD (N = 68) performed on par with TDC individuals (N = 66) in their ability to name the facial emotions of happy, sad, disgust and surprise and were only impaired in their ability to identify the angry expression. On the Matchmaker Expression task that measures the recognition of facial emotions across different facial identities, the ASD participants (N = 66) performed reliably worse than TDC participants (N = 67) on the emotions of happy, sad, disgust, frighten and angry. In the Parts-Wholes test of perceptual strategies of expression, the TDC participants (N = 67) displayed more holistic encoding for the eyes than the mouths in expressive faces whereas ASD participants (N = 66) exhibited the reverse pattern of holistic recognition for the mouth and analytic recognition of the eyes. In summary, findings from the LFI! Battery show that participants with ASD were able to label the basic facial emotions (with the exception of angry expression) on par with age- and IQ-matched TDC participants. However, participants with ASD were impaired in their ability to generalize facial emotions across different identities and showed a tendency to recognize the mouth feature holistically and the eyes as isolated parts. © 2012 The Authors. Journal of Child Psychology and Psychiatry © 2012 Association for Child and Adolescent Mental Health.
Zsoldos, Isabella; Cousin, Emilie; Klein-Koerkamp, Yanica; Pichat, Cédric; Hot, Pascal
2016-11-01
Age-related differences in neural correlates underlying implicit and explicit emotion processing are unclear. Within the framework of the Frontoamygdalar Age-related Differences in Emotion model (St Jacques et al., 2009), our objectives were to examine the behavioral and neural modifications that occur with age for both processes. During explicit and implicit processing of fearful faces, we expected to observe less amygdala activity in older adults (OA) than in younger adults (YA), associated with poorer recognition performance in the explicit task, and more frontal activity during implicit processing, suggesting compensation. At a behavioral level, explicit recognition of fearful faces was impaired in OA compared with YA. We did not observe any cerebral differences between OA and YA during the implicit task, whereas in the explicit task, OA recruited more frontal, parietal, temporal, occipital, and cingulate areas. Our findings suggest that automatic processing of emotion may be preserved during aging, whereas deliberate processing is impaired. Additional neural recruitment in OA did not appear to compensate for their behavioral deficits. Copyright © 2016 Elsevier B.V. All rights reserved.
Wang, Hailing; Ip, Chengteng; Fu, Shimin; Sun, Pei
2017-05-01
Face recognition theories suggest that our brains process invariant (e.g., gender) and changeable (e.g., emotion) facial dimensions separately. To investigate whether these two dimensions are processed in different time courses, we analyzed the selection negativity (SN, an event-related potential component reflecting attentional modulation) elicited by face gender and emotion during a feature selective attention task. Participants were instructed to attend to a combination of face emotion and gender attributes in Experiment 1 (bi-dimensional task) and to either face emotion or gender in Experiment 2 (uni-dimensional task). The results revealed that face emotion did not elicit a substantial SN, whereas face gender consistently generated a substantial SN in both experiments. These results suggest that face gender is more sensitive to feature-selective attention and that face emotion is encoded relatively automatically on SN, implying the existence of different underlying processing mechanisms for invariant and changeable facial dimensions. Copyright © 2017 Elsevier Ltd. All rights reserved.
Lee, Jung Suk; Chun, Ji Won; Kang, Jee In; Kang, Dong-Il; Park, Hae-Jeong; Kim, Jae-Jin
2012-07-30
Emotional memory dysfunction may be associated with anhedonia in schizophrenia. This study aimed to investigate the neurobiological basis of emotional memory and its relationship with anhedonia in schizophrenia, specifically in emotional memory-related brain regions of interest (ROIs), including the amygdala, hippocampus, nucleus accumbens, and ventromedial prefrontal cortex. Fourteen patients with schizophrenia and 16 healthy subjects performed a word-image associative encoding task, during which a neutral word was presented with a positive, neutral, or control image. Subjects underwent functional magnetic resonance imaging while performing the recognition task. Correlation analyses were performed between the percent signal change (PSC) in the ROIs and the anhedonia scores. We found no group differences in recognition accuracy and reaction time. The PSC of the hippocampus in the positive and neutral conditions, and the PSC in the nucleus accumbens in the control condition, appeared to be negatively correlated with the Physical Anhedonia Scale (PAS) scores in patients with schizophrenia, while significant correlations with the PAS scores were not observed in healthy subjects. This study provides further evidence of the role of the hippocampus and nucleus accumbens in trait physical anhedonia and of possible associations between emotional memory deficits and trait physical anhedonia in patients with schizophrenia. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Emotion processing biases and resting EEG activity in depressed adolescents
Auerbach, Randy P.; Stewart, Jeremy G.; Stanton, Colin H.; Mueller, Erik M.; Pizzagalli, Diego A.
2015-01-01
Background While theorists have posited that adolescent depression is characterized by emotion processing biases (greater propensity to identify sad than happy facial expressions), findings have been mixed. Additionally, the neural correlates associated with putative emotion processing biases remain largely unknown. Our aim was to identify emotion processing biases in depressed adolescents and examine neural abnormalities related to these biases using high-density resting EEG and source localization. Methods Healthy (n = 36) and depressed (n = 23) female adolescents, aged 13–18 years, completed a facial recognition task in which they identified happy, sad, fear, and angry expressions across intensities from 10% (low) to 100% (high). Additionally, 128-channel resting (i.e., task-free) EEG was recorded and analyzed using a distributed source localization technique (LORETA). Given research implicating the dorsolateral prefrontal cortex (DLPFC) in depression and emotion processing, analyses focused on this region. Results Relative to healthy youth, depressed adolescents were more accurate for sad and less accurate for happy, particularly low-intensity happy faces. No differences emerged for fearful or angry facial expressions. Further, LORETA analyses revealed greater theta and alpha current density (i.e., reduced brain activity) in depressed versus healthy adolescents, particularly in the left DLPFC (BA9/BA46). Theta and alpha current density were positively correlated, and greater current density predicted reduced accuracy for happy faces. Conclusion Depressed female adolescents were characterized by emotion processing biases in favor of sad emotions and reduced recognition of happiness, especially when cues of happiness were subtle. Blunted recognition of happy was associated with left DLPFC resting hypoactivity. PMID:26032684
Social cognition in schizophrenia: cognitive and affective factors.
Ziv, Ido; Leiser, David; Levine, Joseph
2011-01-01
Social cognition refers to how people conceive, perceive, and draw inferences about mental and emotional states of others in the social world. Previous studies suggest that the concept of social cognition involves several abilities, including those related to affect and cognition. The present study analyses the deficits of individuals with schizophrenia in two areas of social cognition: Theory of Mind (ToM) and emotion recognition and processing. Examining the impairment of these abilities in patients with schizophrenia has the potential to elucidate the neurophysiological regions involved in social cognition and may also have the potential to aid rehabilitation. Two experiments were conducted. Both included the same five tasks: first- and second-level false-belief ToM tasks, emotion inferencing, understanding of irony, and matrix reasoning (a WAIS-R subtest). The matrix reasoning task was administered to evaluate and control for the association of the other tasks with analytic reasoning skills. Experiment 1 involved factor analysis of the task performance of 75 healthy participants. Experiment 2 compared 30 patients with schizophrenia to an equal number of matched controls. Results. (1) The five tasks were clearly divided into two factors corresponding to the two areas of social cognition, ToM and emotion recognition and processing. (2) Schizophrenics' performance was impaired on all tasks, particularly on those loading heavily on the analytic component (matrix reasoning and second-order ToM). (3) Matrix reasoning, second-level ToM (ToM2), and irony were found to distinguish patients from controls, even when all other tasks that revealed significant impairment in the patients' performance were taken into account. The two areas of social cognition examined are related to distinct factors. The mechanism for answering ToM questions (especially ToM2) depends on analytic reasoning capabilities, but the difficulties they present to individuals with schizophrenia are due to other components as well. The impairment in social cognition in schizophrenia stems from deficiencies in several mechanisms, including the ability to think analytically and to process emotion information and cues.
The effect of the social regulation of emotion on emotional long-term memory.
Flores, Luis E; Berenbaum, Howard
2017-04-01
Memories for emotional events tend to be stronger than for neutral events, and weakening negative memories can be helpful to promote well-being. The present study examined whether the social regulation of emotion (in the form of handholding) altered the strength of emotional long-term memory. A sample of 219 undergraduate students viewed sets of negative, neutral, and positive images. Each participant held a stress ball while viewing half of the images and held someone's hand while viewing the other half. Participants returned 1 week later to complete a recognition task. Performance on the recognition task demonstrated that participants had lower memory accuracy for negative but not for positive pictures that were shown while they were holding someone's hand compared with when they were holding a stress ball. Although handholding altered the strength of negative emotional long-term memory, it did not down-regulate negative affective response as measured by self-report or facial expressivity. The present findings provide evidence that the social regulation of emotion can help weaken memory for negative information. Given the role of strong negative memories in different forms of psychopathology (e.g., depression, posttraumatic stress disorder), these findings may help better understand how close relationships protect against psychopathology. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Impaired recognition of body expressions in the behavioral variant of frontotemporal dementia.
Van den Stock, Jan; De Winter, François-Laurent; de Gelder, Beatrice; Rangarajan, Janaki Raman; Cypers, Gert; Maes, Frederik; Sunaert, Stefan; Goffin, Karolien; Vandenberghe, Rik; Vandenbulcke, Mathieu
2015-08-01
Progressive deterioration of social cognition and emotion processing are core symptoms of the behavioral variant of frontotemporal dementia (bvFTD). Here we investigate whether bvFTD is also associated with impaired recognition of static (Experiment 1) and dynamic (Experiment 2) bodily expressions. In addition, we compared body expression processing with processing of static (Experiment 3) and dynamic (Experiment 4) facial expressions, as well as with face identity processing (Experiment 5). The results reveal that bvFTD is associated with impaired recognition of static and dynamic bodily and facial expressions, while identity processing was intact. No differential impairments were observed regarding motion (static vs. dynamic) or category (body vs. face). Within the bvFTD group, we observed a significant partial correlation between body and face expression recognition, when controlling for performance on the identity task. Voxel-Based Morphometry (VBM) analysis revealed that body emotion recognition was positively associated with gray matter volume in a region of the inferior frontal gyrus (pars orbitalis/triangularis). The results are in line with a supramodal emotion recognition deficit in bvFTD. Copyright © 2015 Elsevier Ltd. All rights reserved.
Denmark, Tanya; Atkinson, Joanna; Campbell, Ruth; Swettenham, John
2014-10-01
Facial expressions in sign language carry a variety of communicative features. While emotion can modulate a spoken utterance through changes in intonation, duration and intensity, in sign language specific facial expressions presented concurrently with a manual sign perform this function. When deaf adult signers cannot see facial features, their ability to judge emotion in a signed utterance is impaired (Reilly et al. in Sign Lang Stud 75:113-118, 1992). We examined the role of the face in the comprehension of emotion in sign language in a group of typically developing (TD) deaf children and in a group of deaf children with autism spectrum disorder (ASD). We replicated Reilly et al.'s (Sign Lang Stud 75:113-118, 1992) adult results in the TD deaf signing children, confirming the importance of the face in understanding emotion in sign language. The ASD group performed more poorly on the emotion recognition task than the TD children. The deaf children with ASD showed a deficit in emotion recognition during sign language processing analogous to the deficit in vocal emotion recognition that has been observed in hearing children with ASD.
Kanakam, Natalie; Krug, Isabel; Raoult, Charlotte; Collier, David; Treasure, Janet
2013-07-01
Emotional processing difficulties are potential risk markers for eating disorders that are also present after recovery. The aim of this study was to examine these traits in twins with eating disorders. The Reading the Mind in the Eyes test, Emotional Stroop task and the Difficulties in Emotion Regulation Scale were administered to 112 twins with and without eating disorders (DSM IV-TR eating disorder criteria). Generalised estimating equations compared twins with eating disorders against unaffected co-twins and control twins, and within-pair correlations were calculated for clinical monozygotic (n = 50) and dizygotic twins (n = 20). Emotion recognition difficulties, attentional biases to social threat and difficulties in emotion regulation were greater in twins with eating disorders, and some were present in their unaffected twin siblings. Evidence for a possible genetic basis was highest for emotion recognition and attentional biases to social stimuli. Emotion recognition difficulties and sensitivity to social threat appear to be endophenotypes associated with eating disorders. However, the limited statistical power means that these findings are tentative and require further replication. Copyright © 2013 John Wiley & Sons, Ltd and Eating Disorders Association.
Actively paranoid patients with schizophrenia over attribute anger to neutral faces.
Pinkham, Amy E; Brensinger, Colleen; Kohler, Christian; Gur, Raquel E; Gur, Ruben C
2011-02-01
Previous investigations of the influence of paranoia on facial affect recognition in schizophrenia have been inconclusive as some studies demonstrate better performance for paranoid relative to non-paranoid patients and others show that paranoid patients display greater impairments. These studies have been limited by small sample sizes and inconsistencies in the criteria used to define groups. Here, we utilized an established emotion recognition task and a large sample to examine differential performance in emotion recognition ability between patients who were actively paranoid (AP) and those who were not actively paranoid (NAP). Accuracy and error patterns on the Penn Emotion Recognition test (ER40) were examined in 132 patients (64 NAP and 68 AP). Groups were defined based on the presence of paranoid ideation at the time of testing rather than diagnostic subtype. AP and NAP patients did not differ in overall task accuracy; however, an emotion by group interaction indicated that AP patients were significantly worse than NAP patients at correctly labeling neutral faces. A comparison of error patterns on neutral stimuli revealed that the groups differed only in misattributions of anger expressions, with AP patients being significantly more likely to misidentify a neutral expression as angry. The present findings suggest that paranoia is associated with a tendency to over attribute threat to ambiguous stimuli and also lend support to emerging hypotheses of amygdala hyperactivation as a potential neural mechanism for paranoid ideation. Copyright © 2010 Elsevier B.V. All rights reserved.
Verdejo-García, Antonio; Albein-Urios, Natalia; Molina, Esther; Ching-López, Ana; Martínez-González, José M; Gutiérrez, Blanca
2013-11-01
Based on previous evidence of a MAOA gene*cocaine use interaction on orbitofrontal cortex volume attrition, we tested whether the MAOA low activity variant and cocaine use severity are interactively associated with impulsivity and behavioral indices of orbitofrontal dysfunction: emotion recognition and decision-making. 72 cocaine dependent individuals and 52 non-drug using controls (including healthy individuals and problem gamblers) were genotyped for the MAOA gene and tested using the UPPS-P Impulsive Behavior Scale, the Iowa Gambling Task and the Ekman's Facial Emotions Recognition Test. To test the main hypothesis, we conducted hierarchical multiple regression analyses including three sets of predictors: (1) age, (2) MAOA genotype and severity of cocaine use, and (3) the interaction between MAOA genotype and severity of cocaine use. UPPS-P, Ekman Test and Iowa Gambling Task's scores were the outcome measures. We computed the statistical significance of the prediction change yielded by each consecutive set, with 'a priori' interest in the MAOA*cocaine severity interaction. We found significant effects of the MAOA gene*cocaine use severity interaction on the emotion recognition scores and the UPPS-P's dimensions of Positive Urgency and Sensation Seeking: Low activity carriers with higher cocaine exposure had poorer emotion recognition and higher Positive Urgency and Sensation Seeking. Cocaine users carrying the MAOA low activity show a greater impact of cocaine use on impulsivity and behavioral measures of orbitofrontal cortex dysfunction. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
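The hierarchical regression procedure described above (predictor sets entered consecutively, with the significance of the prediction change tested at each step) can be sketched with nested models compared by F-tests. The example below uses synthetic data; the variable names, sample size and effect sizes are assumptions for illustration only.

```python
# Schematic hierarchical regression: three consecutive predictor sets, with the
# added contribution of each set tested via nested-model F comparisons.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
n = 124
df = pd.DataFrame({
    "age": rng.normal(35, 8, n),
    "maoa_low": rng.integers(0, 2, n),            # low-activity variant carrier (0/1)
    "cocaine_severity": rng.gamma(2.0, 2.0, n),   # exposure measure (0 for non-users)
})
# Simulated outcome, e.g. an emotion-recognition or urgency score.
df["outcome"] = 0.3 * df.maoa_low * df.cocaine_severity + rng.normal(0, 1, n)

m1 = smf.ols("outcome ~ age", df).fit()                                 # set 1
m2 = smf.ols("outcome ~ age + maoa_low + cocaine_severity", df).fit()   # set 2
m3 = smf.ols("outcome ~ age + maoa_low * cocaine_severity", df).fit()   # set 3: interaction

# F-tests for the change in prediction produced by each consecutive set.
print(anova_lm(m1, m2, m3))
```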
Sharpe, Emma; Wallis, Deborah J; Ridout, Nathan
2016-06-30
This study aimed to: (i) determine if the attention bias towards angry faces reported in eating disorders generalises to a non-clinical sample varying in eating disorder-related symptoms; (ii) examine if the bias occurs during initial orientation or later strategic processing; and (iii) confirm previous findings of impaired facial emotion recognition in non-clinical disordered eating. Fifty-two females viewed a series of face-pairs (happy or angry paired with neutral) whilst their attentional deployment was continuously monitored using an eye-tracker. They subsequently identified the emotion portrayed in a separate series of faces. The highest (n=18) and lowest scorers (n=17) on the Eating Disorders Inventory (EDI) were compared on the attention and facial emotion recognition tasks. Those with relatively high scores exhibited impaired facial emotion recognition, confirming previous findings in similar non-clinical samples. They also displayed biased attention away from emotional faces during later strategic processing, which is consistent with previously observed impairments in clinical samples. These differences were related to drive-for-thinness. Although we found no evidence of a bias towards angry faces, it is plausible that the observed impairments in emotion recognition and avoidance of emotional faces could disrupt social functioning and act as a risk factor for the development of eating disorders. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Neural substrates of interpreting actions and emotions from body postures.
Kana, Rajesh K; Travers, Brittany G
2012-04-01
Accurately reading the body language of others may be vital for navigating the social world, and this ability may be influenced by factors, such as our gender, personality characteristics and neurocognitive processes. This fMRI study examined the brain activation of 26 healthy individuals (14 women and 12 men) while they judged the action performed or the emotion felt by stick figure characters appearing in different postures. In both tasks, participants activated areas associated with visual representation of the body, motion processing and emotion recognition. Behaviorally, participants demonstrated greater ease in judging the physical actions of the characters compared to judging their emotional states, and participants showed more activation in areas associated with emotion processing in the emotion detection task, whereas they showed more activation in visual, spatial and action-related areas in the physical action task. Gender differences emerged in brain responses, such that men showed greater activation than women in the left dorsal premotor cortex in both tasks. Finally, participants higher in self-reported empathy demonstrated greater activation in areas associated with self-referential processing and emotion interpretation. These results suggest that empathy levels and sex of the participant may affect neural responses to emotional body language.
Neural substrates of interpreting actions and emotions from body postures
Travers, Brittany G.
2012-01-01
Accurately reading the body language of others may be vital for navigating the social world, and this ability may be influenced by factors, such as our gender, personality characteristics and neurocognitive processes. This fMRI study examined the brain activation of 26 healthy individuals (14 women and 12 men) while they judged the action performed or the emotion felt by stick figure characters appearing in different postures. In both tasks, participants activated areas associated with visual representation of the body, motion processing and emotion recognition. Behaviorally, participants demonstrated greater ease in judging the physical actions of the characters compared to judging their emotional states, and participants showed more activation in areas associated with emotion processing in the emotion detection task, whereas they showed more activation in visual, spatial and action-related areas in the physical action task. Gender differences emerged in brain responses, such that men showed greater activation than women in the left dorsal premotor cortex in both tasks. Finally, participants higher in self-reported empathy demonstrated greater activation in areas associated with self-referential processing and emotion interpretation. These results suggest that empathy levels and sex of the participant may affect neural responses to emotional body language. PMID:21504992
Can emotion recognition be taught to children with autism spectrum conditions?
Baron-Cohen, Simon; Golan, Ofer; Ashwin, Emma
2009-01-01
Children with autism spectrum conditions (ASC) have major difficulties in recognizing and responding to emotional and mental states in others' facial expressions. Such difficulties in empathy underlie their social-communication difficulties that form a core of the diagnosis. In this paper we ask whether aspects of empathy can be taught to young children with ASC. We review a study that evaluated The Transporters, an animated series designed to enhance emotion comprehension in children with ASC. Children with ASC (4–7 years old) watched The Transporters every day for four weeks. Participants were tested before and after intervention on emotional vocabulary and emotion recognition at three levels of generalization. The intervention group improved significantly more than a clinical control group on all task levels, performing comparably to typical controls at time 2. The discussion centres on how vehicles as mechanical systems may be one key reason why The Transporters caused the improved understanding and recognition of emotions in children with ASC. The implications for the design of autism-friendly interventions are also explored. PMID:19884151
Loukusa, Soile; Mäkinen, Leena; Kuusikko-Gauffin, Sanna; Ebeling, Hanna; Moilanen, Irma
2014-01-01
Social perception skills, such as understanding the mind and emotions of others, affect children's communication abilities in real-life situations. In addition to autism spectrum disorder (ASD), there is increasing evidence that children with specific language impairment (SLI) also demonstrate difficulties in their social perception abilities. This study aimed to compare the performance of children with SLI, ASD and typical development (TD) in social perception tasks measuring Theory of Mind (ToM) and emotion recognition, and to evaluate the association between social perception tasks and language tests measuring word-finding abilities, knowledge of grammatical morphology and verbal working memory. Children with SLI (n = 18), ASD (n = 14) and TD (n = 25) completed two NEPSY-II subtests measuring social perception abilities: (1) Affect Recognition and (2) ToM (including Verbal and non-verbal Contextual tasks). In addition, children's word-finding abilities were measured with the TWF-2, grammatical morphology with the Grammatical Closure subtest of the ITPA, and verbal working memory with the Sentence Repetition or Word List Interference subtests of the NEPSY-II (chosen according to the child's age). Children with ASD scored significantly lower than children with SLI or TD on the NEPSY-II Affect Recognition subtest. Both the SLI and ASD groups scored significantly lower than TD children on the Verbal tasks of the NEPSY-II ToM subtest. However, there were no significant group differences on the non-verbal Contextual tasks of the ToM subtest. Verbal ToM tasks were correlated with the Grammatical Closure subtest and the TWF-2 in children with SLI. In children with ASD, the correlation between the TWF-2 and the Verbal ToM tasks was moderate, almost achieving statistical significance, but no other correlations were found. Both the SLI and ASD groups showed difficulties in tasks measuring verbal ToM, but differences were not found in tasks measuring non-verbal Contextual ToM. The association between Verbal ToM tasks and language tests was stronger in children with SLI than in children with ASD. There is a need for further studies in order to understand the interaction between different areas of language and cognitive development. © 2014 Royal College of Speech and Language Therapists.
The level of cognitive function and recognition of emotions in older adults
Singh-Manoux, Archana; Batty, G. David; Ebmeier, Klaus P.; Jokela, Markus; Harmer, Catherine J.; Kivimäki, Mika
2017-01-01
Background The association between cognitive decline and the ability to recognise emotions in interpersonal communication is not well understood. We aimed to investigate the association between cognitive function and the ability to recognise emotions in other people's facial expressions across the full continuum of cognitive capacity. Methods Cross-sectional analysis of 4039 participants (3016 men and 1023 women, aged 59 to 82 years) in the Whitehall II study. Cognitive function was assessed using the 30-item Mini-Mental State Examination (MMSE), further classified into 8 groups: 30, 29, 28, 27, 26, 25, 24, and <24 (possible dementia) MMSE points. The Facial Expression Recognition Task (FERT) was used to examine recognition of anger, fear, disgust, sadness, and happiness. Results The multivariable-adjusted difference in the percentage of accurate recognition between the highest and lowest MMSE groups was 14.9 (95% CI, 11.1–18.7) for anger, 15.5 (11.9–19.2) for fear, 18.5 (15.2–21.8) for disgust, 11.6 (7.3–16.0) for sadness, and 6.3 (3.1–9.4) for happiness. However, recognition of several emotions was already reduced after a 1- to 2-point reduction in MMSE score, and recognition worsened at an accelerating rate as MMSE scores declined further. Conclusions The ability to recognize emotion in facial expressions is affected at an early stage of cognitive impairment and might decline at an accelerated rate with the deterioration of cognitive function. Accurate recognition of happiness seems to be less affected by a severe decline in cognitive performance than recognition of negatively valued emotions. PMID:28977015
Discrimination and categorization of emotional facial expressions and faces in Parkinson's disease.
Alonso-Recio, Laura; Martín, Pilar; Rubio, Sandra; Serrano, Juan M
2014-09-01
Our objective was to compare the ability to discriminate and categorize emotional facial expressions (EFEs) and facial identity characteristics (age and/or gender) in a group of 53 individuals with Parkinson's disease (PD) and another group of 53 healthy subjects. On the one hand, by means of discrimination and identification tasks, we compared two stages in the visual recognition process that could be selectively affected in individuals with PD. On the other hand, facial expression versus gender and age comparison permits us to contrast whether the emotional or non-emotional content influences the configural perception of faces. In Experiment I, we did not find differences between groups, either with facial expression or age, in discrimination tasks. Conversely, in Experiment II, we found differences between the groups, but only in the EFE identification task. Taken together, our results indicate that configural perception of faces does not seem to be globally impaired in PD. However, this ability is selectively altered when the categorization of emotional faces is required. A deeper assessment of the PD group indicated that decline in facial expression categorization is more evident in a subgroup of patients with higher global impairment (motor and cognitive). Taken together, these results suggest that the problems found in facial expression recognition may be associated with the progressive neuronal loss in frontostriatal and mesolimbic circuits, which characterizes PD. © 2013 The British Psychological Society.
Narme, Pauline; Mouras, Harold; Roussel, Martine; Duru, Cécile; Krystkowiak, Pierre; Godefroy, Olivier
2013-03-01
Parkinson's disease (PD) is associated with behavioral disorders that can affect social functioning but are poorly understood. Since emotional and cognitive social processes are known to be crucial in social relationships, impairment of these processes may account for the emergence of behavioral disorders. We used a systematic battery of tests to assess emotional processes and social cognition in PD patients and relate our findings to conventional neuropsychological data (especially behavioral disorders). Twenty-three PD patients and 46 controls (matched for age and educational level) were included in the study and underwent neuropsychological testing, including an assessment of the behavioral and cognitive components of executive function. Emotional and cognitive social processes were assessed with the Interpersonal Reactivity Index caregiver-administered questionnaire (as a measure of empathy), a facial emotion recognition task and two theory of mind (ToM) tasks. When compared with controls, PD patients showed low levels of empathy (p = .006), impaired facial emotion recognition (which persisted after correction for perceptual abilities) (p = .001), poor performance in a second-order ToM task (p = .008) that assessed both cognitive (p = .004) and affective (p = .03) inferences and, lastly, frequent dysexecutive behavioral disorders (in over 40% of the patients). Overall, impaired emotional and cognitive social functioning was observed in 17% of patients and was related to certain cognitive dysexecutive disorders. In terms of behavioral dysexecutive disorders, social behavior disorders were related to impaired emotional and cognitive social functioning (p = .04) but were independent of cognitive impairments. Emotional and cognitive social processes were found to be impaired in Parkinson's disease. This impairment may account for the emergence of social behavioral disorders. PsycINFO Database Record (c) 2013 APA, all rights reserved.
Evers, Kris; Kerkhof, Inneke; Steyaert, Jean; Noens, Ilse; Wagemans, Johan
2014-01-01
Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybrid facial expressions. A group of six-to-eight-year-old boys with ASD and an age- and intelligence-matched typically developing (TD) group without intellectual disability performed an emotion labelling task with hybrid facial expressions. Five static expressions were used: one neutral expression and four emotional expressions, namely, anger, fear, happiness, and sadness. Hybrid faces were created, consisting of an emotional face half (upper or lower face region) with the other face half showing a neutral expression. Results showed no emotion recognition problem in ASD. Moreover, we provided evidence for the existence of top- and bottom-emotions in children: correct identification of expressions mainly depends on information in the eyes (so-called top-emotions: happiness) or in the mouth region (so-called bottom-emotions: sadness, anger, and fear). No stronger reliance on mouth information was found in children with ASD.
Derntl, Birgit; Habel, Ute; Windischberger, Christian; Robinson, Simon; Kryspin-Exner, Ilse; Gur, Ruben C; Moser, Ewald
2009-08-04
The ability to recognize emotions in facial expressions relies on an extensive neural network with the amygdala as the key node, as has typically been demonstrated for the processing of fearful stimuli. A sufficient characterization of the factors influencing and modulating amygdala function, however, has not yet been achieved. Owing to absent or diverging results on its involvement in recognizing all, or only certain, negative emotions, and on the influence of gender or ethnicity, these questions are still under debate. This high-resolution fMRI study addresses the effects of several relevant parameters, such as emotional valence, gender and poser ethnicity, on amygdala activation during facial emotion recognition in 50 Caucasian subjects. Stimuli were color photographs of emotional Caucasian and African American faces. Bilateral amygdala activation was obtained for all emotional expressions (anger, disgust, fear, happy, and sad) and neutral faces across all subjects. However, only in males was a significant correlation between amygdala activation and behavioral response to fearful stimuli observed, indicating higher amygdala responses with better fear recognition and thus pointing to subtle gender differences. No significant influence of poser ethnicity on amygdala activation occurred, but analysis of recognition accuracy revealed a significant, emotion-dependent impact of poser ethnicity. Applying high-resolution fMRI while subjects were performing an explicit emotion recognition task revealed bilateral amygdala activation to all emotions presented and to neutral expressions. This mechanism seems to operate similarly in healthy females and males and for both in-group and out-group ethnicities. Our results support the assumption that an intact amygdala response is fundamental in the processing of these salient stimuli, owing to its relevance-detection function.
Arruti, Andoni; Cearreta, Idoia; Álvarez, Aitor; Lazkano, Elena; Sierra, Basilio
2014-01-01
The study of emotions in human–computer interaction is a growing research area. This paper describes an attempt to select the most significant features for emotion recognition in spoken Basque and Spanish using different feature selection methods. The RekEmozio database was used as the experimental data set, and several machine learning paradigms were used for the emotion classification task. Experiments were executed in three phases, using different sets of features as classification variables in each phase; feature subset selection was applied at each phase in order to find the most relevant feature subset. The three-phase design was chosen to check the validity of the proposed approach. The results show that an instance-based learning algorithm combined with feature subset selection based on evolutionary algorithms is the best machine learning paradigm for automatic emotion recognition across all feature sets, achieving a mean emotion recognition rate of 80.05% in Basque and 74.82% in Spanish. To assess the quality of the proposed process, a greedy search approach (FSS-Forward) was also applied and the two approaches are compared. Based on these results, a set of the most relevant speaker-independent features is proposed for both languages and new perspectives are suggested. PMID:25279686
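As an illustration of the greedy wrapper search mentioned above (FSS-Forward) around an instance-based learner, the following Python sketch runs forward feature selection with a k-nearest-neighbour classifier on synthetic data. The data set, feature count and classifier settings are placeholders, not the RekEmozio setup or the evolutionary search the paper favours.

```python
# Hypothetical sketch of wrapper-style forward feature selection (FSS-Forward)
# around an instance-based learner (k-NN). Data and parameters are synthetic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Stand-in for acoustic feature vectors and emotion labels
X, y = make_classification(n_samples=300, n_features=20, n_informative=6,
                           n_classes=4, random_state=0)

def forward_selection(X, y, max_features=8):
    """Greedily add the feature that most improves cross-validated accuracy."""
    selected, best_score = [], 0.0
    remaining = list(range(X.shape[1]))
    clf = KNeighborsClassifier(n_neighbors=5)          # instance-based learner
    while remaining and len(selected) < max_features:
        scores = [(np.mean(cross_val_score(clf, X[:, selected + [f]], y, cv=5)), f)
                  for f in remaining]
        score, feat = max(scores)
        if score <= best_score:                        # stop when no feature helps
            break
        best_score, selected = score, selected + [feat]
        remaining.remove(feat)
    return selected, best_score

features, acc = forward_selection(X, y)
print(f"selected features: {features}, CV accuracy: {acc:.3f}")
```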
Spalek, Klara; Fastenrath, Matthias; Ackermann, Sandra; Auschra, Bianca; Coynel, David; Frey, Julia; Gschwind, Leo; Hartmann, Francina; van der Maarel, Nadine; Papassotiropoulos, Andreas; de Quervain, Dominique; Milnik, Annette
2015-01-21
Extensive evidence indicates that women outperform men in episodic memory tasks. Furthermore, women are known to evaluate emotional stimuli as more arousing than men. Because emotional arousal typically increases episodic memory formation, the females' memory advantage might be more pronounced for emotionally arousing information than for neutral information. Here, we report behavioral data from 3398 subjects, who performed picture rating and memory tasks, and corresponding fMRI data from up to 696 subjects. We were interested in the interaction between sex and valence category on emotional appraisal, memory performances, and fMRI activity. The behavioral results showed that females evaluate in particular negative (p < 10^-16) and positive (p = 2 × 10^-4), but not neutral pictures, as emotionally more arousing (p(interaction) < 10^-16) than males. However, in the free recall females outperformed males not only in positive (p < 10^-16) and negative (p < 5 × 10^-5), but also in neutral picture recall (p < 3.4 × 10^-8), with a particular advantage for positive pictures (p(interaction) < 4.4 × 10^-10). Importantly, females' memory advantage during free recall was absent in a recognition setting. We identified activation differences in fMRI, which corresponded to the females' stronger appraisal of especially negative pictures, but no activation differences that reflected the interaction effect in the free recall memory task. In conclusion, females' valence-category-specific memory advantage is only observed in a free recall, but not a recognition setting and does not depend on females' higher emotional appraisal. Copyright © 2015 the authors.
Neural Correlates of Explicit versus Implicit Facial Emotion Processing in ASD
ERIC Educational Resources Information Center
Luckhardt, Christina; Kröger, Anne; Cholemkery, Hannah; Bender, Stephan; Freitag, Christine M.
2017-01-01
The underlying neural mechanisms of implicit and explicit facial emotion recognition (FER) were studied in children and adolescents with autism spectrum disorder (ASD) compared to matched typically developing controls (TDC). EEG was obtained from N = 21 ASD and N = 16 TDC. Task performance, visual (P100, N170) and cognitive (late positive…
Psychopaths lack the automatic avoidance of social threat: relation to instrumental aggression.
Louise von Borries, Anna Katinka; Volman, Inge; de Bruijn, Ellen Rosalia Aloïs; Bulten, Berend Hendrik; Verkes, Robbert Jan; Roelofs, Karin
2012-12-30
Psychopathy (PP) is associated with marked abnormalities in social emotional behaviour, such as high instrumental aggression (IA). A crucial but largely ignored question is whether automatic social approach-avoidance tendencies may underlie this condition. We tested whether offenders with PP show lack of automatic avoidance tendencies, usually activated when (healthy) individuals are confronted with social threat stimuli (angry faces). We applied a computerized approach-avoidance task (AAT), where participants pushed or pulled pictures of emotional faces using a joystick, upon which the faces decreased or increased in size, respectively. Furthermore, participants completed an emotion recognition task which was used to control for differences in recognition of facial emotions. In contrast to healthy controls (HC), PP patients showed total absence of avoidance tendencies towards angry faces. Interestingly, those responses were related to levels of instrumental aggression and the (in)ability to experience personal distress (PD). These findings suggest that social performance in psychopaths is disturbed on a basic level of automatic action tendencies. The lack of implicit threat avoidance tendencies may underlie their aggressive behaviour. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Visser-Keizer, Annemarie C.; Westerhof-Evers, Herma J.; Gerritsen, Marleen J. J.; van der Naalt, Joukje; Spikman, Jacoba M.
2016-01-01
Fear is an important emotional reaction that guides decision making in situations of ambiguity or uncertainty. Both recognition of facial expressions of fear and decision making ability can be impaired after traumatic brain injury (TBI), in particular when the frontal lobe is damaged. So far, it has not been investigated how recognition of fear influences risk behavior in healthy subjects and TBI patients. The ability to recognize fear is thought to be related to the ability to experience fear and to use it as a warning signal to guide decision making. We hypothesized that a better ability to recognize fear would be related to a better regulation of risk behavior, with healthy controls outperforming TBI patients. To investigate this, 59 healthy subjects and 49 TBI patients were assessed with a test for emotion recognition (Facial Expression of Emotion: Stimuli and Tests) and a gambling task (Iowa Gambling Task (IGT)). The results showed that, regardless of post traumatic amnesia duration or the presence of frontal lesions, patients were more impaired than healthy controls on both fear recognition and decision making. In both groups, a significant relationship was found between better fear recognition, the development of an advantageous strategy across the IGT and less risk behavior in the last blocks of the IGT. Educational level moderated this relationship in the final block of the IGT. This study has important clinical implications, indicating that impaired decision making and risk behavior after TBI can be preceded by deficits in the processing of fear. PMID:27870900
Besche-Richard, C; Bourrin-Tisseron, A; Olivier, M; Cuervo-Lombard, C-V; Limosin, F
2012-06-01
Deficits in the recognition of facial emotions and the attribution of mental states are now well documented in schizophrenic patients. However, the link between these two complex cognitive functions is not clearly understood, especially in schizophrenia. In this study, we tested the link between the recognition of facial emotions and mentalizing capacities, notably the attribution of beliefs, in healthy and schizophrenic participants. We hypothesized that performance in the recognition of facial emotions, rather than working memory or executive functioning, would be the best predictor of the capacity to attribute a belief. Twenty schizophrenic participants diagnosed according to DSM-IV-TR (mean age: 35.9 years, S.D. 9.07; mean education level: 11.15 years, S.D. 2.58), clinically stabilized and receiving neuroleptic or antipsychotic medication, participated in the study. They were matched on age (mean age: 36.3 years, S.D. 10.9) and educational level (mean educational level: 12.10, S.D. 2.25) with 30 healthy participants. All participants were evaluated with a pool of tasks testing the recognition of facial emotions (the Baron-Cohen faces), the attribution of beliefs (two first-order and two second-order stories), working memory (the WAIS-III digit span and the Corsi test) and executive functioning (Trail Making Test A and B, Wisconsin Card Sorting Test brief version). Comparing schizophrenic and healthy participants, our results confirmed differences in performance on both the recognition of facial emotions and the attribution of beliefs. Simple linear regression showed that the recognition of facial emotions, rather than working memory or executive functioning, was the best predictor of performance on the theory of mind stories. Our results confirmed, in a sample of schizophrenic patients, deficits in the recognition of facial emotions and in the attribution of mental states. Our new finding is that performance in the recognition of facial emotions is the best predictor of performance in the attribution of beliefs. Marshall et al.'s model of empathy can explain this link between the recognition of facial emotions and the comprehension of beliefs. Copyright © 2011 L'Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.
Memory bias in health anxiety is related to the emotional valence of health-related words.
Ferguson, Eamonn; Moghaddam, Nima G; Bibby, Peter A
2007-03-01
A model based on the associative strength of object evaluations is tested to explain why those who score higher on health anxiety have a better memory for health-related words. Sixty participants observed health and nonhealth words. A recognition memory task followed a free recall task and finally subjects provided evaluations (emotionality, imageability, and frequency) for all the words. Hit rates for health words, d', c, and psychological response times (PRTs) for evaluations were examined using multi-level modelling (MLM) and regression. Health words had a higher hit rate, which was greater for those with higher levels of health anxiety. The higher hit rate for health words is partly mediated by the extent to which health words are evaluated as emotionally unpleasant, and this was stronger for (moderated by) those with higher levels of health anxiety. Consistent with the associative strength model, those with higher levels of health anxiety demonstrated faster PRTs when making emotional evaluations of health words compared to nonhealth words, while those lower in health anxiety were slower to evaluate health words. Emotional evaluations speed the recognition of health words for high health anxious individuals. These findings are discussed with respect to the wider literature on cognitive processes in health anxiety, automatic processing, implicit attitudes, and emotions in decision making.
Isomura, Tomoko; Ogawa, Shino; Yamada, Satoko; Shibasaki, Masahiro; Masataka, Nobuo
2014-01-01
Previous studies have demonstrated that angry faces capture humans' attention more rapidly than emotionally positive faces. This phenomenon is referred to as the anger superiority effect (ASE). Despite atypical emotional processing, adults and children with Autism Spectrum Disorders (ASD) have been reported to show ASE as well as typically developed (TD) individuals. So far, however, few studies have clarified whether or not the mechanisms underlying ASE are the same for both TD and ASD individuals. Here, we tested how TD and ASD children process schematic emotional faces during detection by employing a recognition task in combination with a face-in-the-crowd task. Results of the face-in-the-crowd task revealed the prevalence of ASE both in TD and ASD children. However, the results of the recognition task revealed group differences: In TD children, detection of angry faces required more configural face processing and disrupted the processing of local features. In ASD children, on the other hand, it required more feature-based processing rather than configural processing. Despite the small sample sizes, these findings provide preliminary evidence that children with ASD, in contrast to TD children, show quick detection of angry faces by extracting local features in faces. PMID:24904477
Briggs-Gowan, Margaret J.; Voss, Joel L.; Petitclerc, Amelie; McCarthy, Kimberly; Blair, R. James R.; Wakschlag, Lauren S.
2016-01-01
Introduction Callous-unemotional (CU) traits in the presence of conduct problems are associated with increased risk of severe antisocial behavior. Developmentally sensitive methods of assessing CU traits have recently been generated, but their construct validity in relation to neurocognitive underpinnings of CU has not been demonstrated. The current study sought to investigate whether the fear-specific emotion recognition deficits associated with CU traits in older individuals are developmentally expressed in young children as low concern for others and punishment insensitivity. Methods A sub-sample of 337 preschoolers (mean age 4.8 years [SD=.8]) who completed neurocognitive tasks was taken from a larger project of preschool psychopathology. Children completed an emotional recognition task in which they were asked to identify the emotional face from the neutral faces in an array. CU traits were assessed using the Low Concern (LC) and Punishment Insensitivity (PI) subscales of the Multidimensional Assessment Profile of Disruptive Behavior (MAP-DB), which were specifically designed to differentiate the normative misbehavior of early childhood from atypical patterns. Results High LC, but not PI, scores were associated with a fear-specific deficit in emotion recognition. Girls were more accurate than boys in identifying emotional expressions but no significant interaction between LC or PI and sex was observed. Conclusions Fear recognition deficits associated with CU traits in older individuals were observed in preschoolers with developmentally-defined patterns of low concern for others. Confirming that the link between CU-related impairments in empathy and distinct neurocognitive deficits is present in very young children suggests that developmentally-specified measurement can detect the substrates of these severe behavioral patterns beginning much earlier than prior work. Exploring the development of CU traits and disruptive behavior disorders at very early ages may provide insights critical to early intervention and prevention of severe antisocial behavior. PMID:27167866
Visual body recognition in a prosopagnosic patient.
Moro, V; Pernigo, S; Avesani, R; Bulgarelli, C; Urgesi, C; Candidi, M; Aglioti, S M
2012-01-01
Conspicuous deficits in face recognition characterize prosopagnosia. Information on whether agnosic deficits may extend to non-facial body parts is lacking. Here we report the neuropsychological description of FM, a patient affected by a complete deficit in face recognition in the presence of mild clinical signs of visual object agnosia. His deficit involves both overt and covert recognition of faces (i.e. recognition of familiar faces, but also categorization of faces for gender or age) as well as the visual mental imagery of faces. By means of a series of matching-to-sample tasks we investigated: (i) a possible association between prosopagnosia and disorders in visual body perception; (ii) the effect of the emotional content of stimuli on the visual discrimination of faces, bodies and objects; (iii) the existence of a dissociation between identity recognition and the emotional discrimination of faces and bodies. Our results document, for the first time, the co-occurrence of body agnosia, i.e. the visual inability to discriminate body forms and body actions, and prosopagnosia. Moreover, the results show better performance in the discrimination of emotional face and body expressions with respect to body identity and neutral actions. Since FM's lesions involve bilateral fusiform areas, it is unlikely that the amygdala-temporal projections explain the relative sparing of emotion discrimination performance. Indeed, the emotional content of the stimuli did not improve the discrimination of their identity. The results hint at the existence of two segregated brain networks involved in identity and emotional discrimination that are at least partially shared by face and body processing. Copyright © 2011 Elsevier Ltd. All rights reserved.
Identifying and detecting facial expressions of emotion in peripheral vision.
Smith, Fraser W; Rossit, Stephanie
2018-01-01
Facial expressions of emotion are signals of high biological value. Whilst recognition of facial expressions has been much studied in central vision, the ability to perceive these signals in peripheral vision has only seen limited research to date, despite the potential adaptive advantages of such perception. In the present experiment, we investigate facial expression recognition and detection performance for each of the basic emotions (plus neutral) at up to 30 degrees of eccentricity. We demonstrate, as expected, a decrease in recognition and detection performance with increasing eccentricity, with happiness and surprise being the best recognized expressions in peripheral vision. In detection, however, while happiness and surprise are still well detected, fear is also a well detected expression. We show that fear is better detected than recognized. Our results demonstrate that task constraints shape the perception of expression in peripheral vision and provide novel evidence that detection and recognition rely on partially separate underlying mechanisms, with the latter more dependent on the higher spatial frequency content of the face stimulus.
Neutral and emotional episodic memory: global impairment after lorazepam or scopolamine.
Kamboj, Sunjeev K; Curran, H Valerie
2006-11-01
Benzodiazepines and anticholinergic drugs have repeatedly been shown to impair episodic memory for emotionally neutral material in humans. However, their effect on memory for emotionally laden stimuli has been relatively neglected. We sought to investigate the effects of the benzodiazepine, lorazepam, and the anticholinergic, scopolamine, on incidental episodic memory for neutral and emotional components of a narrative memory task in humans. A double-blind, placebo-controlled independent group design was used with 48 healthy volunteers to examine the effects of these drugs on emotional and neutral episodic memory. As expected, the emotional memory advantage was retained for recall and recognition memory under placebo conditions. However, lorazepam and scopolamine produced anterograde recognition memory impairments on both the neutral and emotional components of the narrative, although floor effects were obtained for recall memory. Furthermore, compared with placebo, recognition memory for both central (gist) and peripheral (detail) aspects of neutral and emotional elements of the narrative was poorer after either drug. Benzodiazepine-induced GABAergic enhancement or scopolamine-induced cholinergic hypofunction results in a loss of the enhancing effect of emotional arousal on memory. Furthermore, lorazepam- and scopolamine-induced memory impairment for both gist (which is amygdala dependent) and detail raises the possibility that their effects on emotional memory do not depend only on the amygdala. We discuss the results with reference to potential clinical/forensic implications of processing emotional memories under conditions of globally impaired episodic memory.
Belkaid, Marwen; Cuperlier, Nicolas; Gaussier, Philippe
2017-01-01
Emotions play a significant role in internal regulatory processes. In this paper, we advocate four key ideas. First, novelty detection can be grounded in sensorimotor experience and allow higher-order appraisal. Second, cognitive processes, such as those involved in self-assessment, influence emotional states by eliciting affects like boredom and frustration. Third, emotional processes such as those triggered by self-assessment influence attentional processes. Last, close emotion-cognition interactions implement an efficient feedback loop for the purpose of top-down behavior regulation. The latter is what we call 'Emotional Metacontrol'. We introduce a model based on artificial neural networks. This architecture is used to control a robotic system in a visual search task. The emotional metacontrol intervenes to bias the robot's visual attention during active object recognition. Through a behavioral and statistical analysis, we show that this mechanism increases the robot's performance and fosters exploratory behavior to avoid deadlocks.
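A purely illustrative sketch of such a metacontrol loop is given below. This is my own schematic, not the authors' neural architecture: a self-assessment signal tracks the lack of progress in an ongoing visual search, the resulting "frustration" level acts as an affect, and that affect biases attention toward exploratory shifts that break deadlocks.

```python
# Illustrative sketch only (not the authors' model): frustration from
# self-assessed lack of progress biases attention toward exploration.
import random

class EmotionalMetacontrol:
    def __init__(self, patience=5):
        self.frustration = 0.0
        self.stalled_steps = 0
        self.patience = patience

    def self_assess(self, progress):
        """Update frustration from the (lack of) progress in the ongoing task."""
        self.stalled_steps = 0 if progress > 0 else self.stalled_steps + 1
        self.frustration = min(1.0, self.stalled_steps / self.patience)

    def attention_bias(self):
        """Higher frustration -> higher probability of shifting attention elsewhere."""
        return self.frustration

def visual_search(targets, max_steps=200):
    meta = EmotionalMetacontrol()
    focus = 0
    for step in range(max_steps):
        if targets[focus]:
            return step                               # target recognized at the current focus
        meta.self_assess(progress=0)                  # no recognition -> no progress
        if random.random() < meta.attention_bias():
            focus = random.randrange(len(targets))    # frustration-driven exploratory shift
        # otherwise the system keeps re-examining the same location: a deadlock
        # that only the emotional bias can break in this toy example
    return None

print(visual_search([False] * 9 + [True]))
```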
Seidel, Eva-Maria; Habel, Ute; Finkelmeyer, Andreas; Hasmann, Alexander; Dobmeier, Matthias; Derntl, Birgit
2012-03-01
Endophenotypes are intermediate phenotypes which are considered a more promising marker of genetic risk than illness itself. While previous research has mostly used cognitive deficits, emotional functions are of greater relevance for bipolar disorder, given the characteristic emotional hyper-reactivity and deficient social-emotional competence. Hence, the aim of the present study was to clarify whether empathic abilities can serve as a possible endophenotype of bipolar disorder by applying a newly developed task in bipolar patients and their first-degree relatives. Three components of empathy (emotion recognition, perspective taking and affective responsiveness) were assessed in a sample of 21 bipolar patients, 21 first-degree relatives and 21 healthy controls. Data analysis indicated significant differences between controls and patients for emotion recognition and affective responsiveness but not for perspective taking. This shows that in addition to difficulties in recognizing facial emotional expressions, bipolar patients have difficulties in identifying emotions they would experience in a given situation. However, the ability to take the perspective of another person in an emotional situation was intact but decreased with increasing severity of residual hypomanic and depressive symptoms. Relatives performed comparably poorly on emotion recognition but did not differ from controls or patients in affective responsiveness. This study is the first to show that deficient emotion recognition is the only component of empathy which forms a possible endophenotype of bipolar disorder. This has important implications for prevention strategies. Furthermore, changes in affective responsiveness in first-degree relatives indicate a potential resilience marker. Copyright © 2011 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Uono, Shota; Sato, Wataru; Toichi, Motomi
2013-01-01
This study was designed to identify specific difficulties and associated features related to the problems with social interaction experienced by individuals with pervasive developmental disorder-not otherwise specified (PDD-NOS) using an emotion-recognition task. We compared individuals with PDD-NOS or Asperger's disorder (ASP) and typically…
ERIC Educational Resources Information Center
Bekele, Esubalew; Crittendon, Julie; Zheng, Zhi; Swanson, Amy; Weitlauf, Amy; Warren, Zachary; Sarkar, Nilanjan
2014-01-01
Teenagers with autism spectrum disorder (ASD) and age-matched controls participated in a dynamic facial affect recognition task within a virtual reality (VR) environment. Participants identified the emotion of a facial expression displayed at varied levels of intensity by a computer generated avatar. The system assessed performance (i.e.,…
When Early Experiences Build a Wall to Others’ Emotions: An Electrophysiological and Autonomic Study
Ardizzi, Martina; Martini, Francesca; Umiltà, Maria Alessandra; Sestito, Mariateresa; Ravera, Roberto; Gallese, Vittorio
2013-01-01
Facial expression of emotions is a powerful vehicle for communicating information about others’ emotional states and it normally induces facial mimicry in the observers. The aim of this study was to investigate if early aversive experiences could interfere with emotion recognition, facial mimicry, and with the autonomic regulation of social behaviors. We conducted a facial emotion recognition task in a group of “street-boys” and in an age-matched control group. We recorded facial electromyography (EMG), a marker of facial mimicry, and respiratory sinus arrhythmia (RSA), an index of the recruitment of autonomic system promoting social behaviors and predisposition, in response to the observation of facial expressions of emotions. Results showed an over-attribution of anger, and reduced EMG responses during the observation of both positive and negative expressions only among street-boys. Street-boys also showed lower RSA after observation of facial expressions and ineffective RSA suppression during presentation of non-threatening expressions. Our findings suggest that early aversive experiences alter not only emotion recognition but also facial mimicry of emotions. These deficits affect the autonomic regulation of social behaviors inducing lower social predisposition after the visualization of facial expressions and an ineffective recruitment of defensive behavior in response to non-threatening expressions. PMID:23593374
Prototype-Incorporated Emotional Neural Network.
Oyedotun, Oyebade K; Khashman, Adnan
2017-08-15
Artificial neural networks (ANNs) aim to simulate biological neural activity. Interestingly, many "engineering" prospects in ANN research have relied on motivations from cognition and psychology studies. So far, two important learning theories that have been the subject of active research are the prototype and adaptive learning theories. The learning rules employed for ANNs can be related to adaptive learning theory, where several examples of the different classes in a task are supplied to the network for adjusting internal parameters. Conversely, prototype learning theory uses prototypes (representative examples), usually one prototype per class contained in the task. These prototypes are supplied for systematic matching with new examples so that class association can be achieved. In this paper, we propose and implement a novel neural network algorithm based on modifying the emotional neural network (EmNN) model to unify the prototype- and adaptive-learning theories. We refer to our new model as the "prototype-incorporated EmNN". Furthermore, we apply the proposed model to two real-life challenging tasks, namely static hand-gesture recognition and face recognition, and compare the results to those obtained using the popular back-propagation neural network (BPNN), emotional BPNN (EmNN), deep networks, an exemplar classification model, and k-nearest neighbor.
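The split the abstract draws between the two theories can be illustrated with a small sketch (a toy example of my own, not the authors' EmNN): prototypes are initialised as class means (prototype learning) and then refined with example-by-example LVQ-style updates (adaptive learning).

```python
# Toy illustration of combining prototype learning (one representative per class)
# with adaptive, example-driven updates (an LVQ-style rule). Not the EmNN model.
import numpy as np

def fit_prototypes(X, y, lr=0.05, epochs=20):
    classes = np.unique(y)
    protos = np.array([X[y == c].mean(axis=0) for c in classes])  # prototype theory: class means
    for _ in range(epochs):                                       # adaptive theory: per-example updates
        for xi, yi in zip(X, y):
            j = np.argmin(np.linalg.norm(protos - xi, axis=1))    # nearest prototype
            if classes[j] == yi:
                protos[j] += lr * (xi - protos[j])                 # pull toward the example
            else:
                protos[j] -= lr * (xi - protos[j])                 # push away from the example
    return classes, protos

def predict(X, classes, protos):
    d = np.linalg.norm(X[:, None, :] - protos[None, :, :], axis=2)
    return classes[np.argmin(d, axis=1)]

# Two-class toy data standing in for gesture/face feature vectors
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
classes, protos = fit_prototypes(X, y)
print("training accuracy:", (predict(X, classes, protos) == y).mean())
```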
Emotion-induced impairments in speeded word recognition tasks.
Zeelenberg, René; Bocanegra, Bruno R; Pecher, Diane
2011-01-01
Recent studies show that emotional stimuli impair the identification of subsequently presented, briefly flashed stimuli. In the present study, we investigated whether emotional distractors (primes) impaired target processing when presentation of the target stimulus was not impoverished. In lexical decision, animacy decision, rhyme decision, and nonword naming, targets were presented in such a manner that they were clearly visible (i.e., targets were not masked and presented until participants responded). In all tasks taboo-sexual distractors caused a slowdown in responding to the subsequent neutral target. Our results indicate that the detrimental effects of emotional distractors are not confined to paradigms in which visibility of the target is limited. Moreover, impairments were obtained even when semantic processing of stimuli was not required.
Marx, Ivo; Krause, John; Berger, Christoph; Häßler, Frank
2014-01-01
Objectives To effectively manage current task demands, attention must be focused on task-relevant information while task-irrelevant information is rejected. However, in everyday life, people must cope with emotions, which may interfere with actual task demands and may challenge functional attention allocation. Control of interfering emotions has been associated with the proper functioning of the dorsolateral prefrontal cortex (DLPFC). As DLPFC dysfunction is evident in subjects with ADHD and in subjects with alcohol dependence, the current study sought to examine the bottom-up effect of emotional distraction on task performance in both disorders. Methods Male adults with ADHD (n = 22), male adults with alcohol dependence (n = 16), and healthy controls (n = 30) performed an emotional working memory task (n-back task). In the background of the task, we presented neutral and negative stimuli that varied in emotional saliency. Results In both clinical groups, a working memory deficit was evident. Moreover, both clinical groups displayed deficient emotional interference control. The n-back performance of the controls was not affected by the emotional distractors, whereas that of subjects with ADHD deteriorated in the presence of low salient distractors, and that of alcoholics did not deteriorate until high salient distractors were presented. Subsequent to task performance, subjects with ADHD accurately recognized more distractors than did alcoholics and controls. In alcoholics, picture recognition accuracy was negatively associated with n-back performance, suggesting a functional association between the ability to suppress emotional distractors and successful task performance. In subjects with ADHD, performance accuracy was negatively associated with ADHD inattentive symptoms, suggesting that inattention contributes to the performance deficit. Conclusions Subjects with ADHD and alcoholics both display an emotional interference control deficit, which is especially pronounced in subjects with ADHD. Beyond dysfunctional attention allocation processes, a more general attention deficit seems to contribute to the more pronounced performance deficit pattern in ADHD. PMID:25265290
ERIC Educational Resources Information Center
Montirosso, Rosario; Peverelli, Milena; Frigerio, Elisa; Crespi, Monica; Borgatti, Renato
2010-01-01
The primary purpose of this study was to examine the effect of the intensity of emotion expression on children's developing ability to label emotion during a dynamic presentation of five facial expressions (anger, disgust, fear, happiness, and sadness). A computerized task (AFFECT--animated full facial expression comprehension test) was used to…
Music-induced changes in functional cerebral asymmetries.
Hausmann, Markus; Hodgetts, Sophie; Eerola, Tuomas
2016-04-01
After decades of research, it remains unclear whether emotion lateralization occurs because one hemisphere is dominant for processing the emotional content of the stimuli, or whether emotional stimuli activate lateralised networks associated with the subjective emotional experience. By using emotion-induction procedures, we investigated the effect of listening to happy and sad music on three well-established lateralization tasks. In a prestudy, Mozart's piano sonata (K. 448) and Beethoven's Moonlight Sonata were rated as the most happy and sad excerpts, respectively. Participants listened to either one emotional excerpt, or sat in silence before completing an emotional chimeric faces task (Experiment 1), visual line bisection task (Experiment 2) and a dichotic listening task (Experiment 3 and 4). Listening to happy music resulted in a reduced right hemispheric bias in facial emotion recognition (Experiment 1) and visuospatial attention (Experiment 2) and increased left hemispheric bias in language lateralization (Experiments 3 and 4). Although Experiments 1-3 revealed an increased positive emotional state after listening to happy music, mediation analyses revealed that the effect on hemispheric asymmetries was not mediated by music-induced emotional changes. The direct effect of music listening on lateralization was investigated in Experiment 4 in which tempo of the happy excerpt was manipulated by controlling for other acoustic features. However, the results of Experiment 4 made it rather unlikely that tempo is the critical cue accounting for the effects. We conclude that listening to music can affect functional cerebral asymmetries in well-established emotional and cognitive laterality tasks, independent of music-induced changes in the emotion state. Copyright © 2016 Elsevier Inc. All rights reserved.
Sex differences in the ability to recognise non-verbal displays of emotion: a meta-analysis.
Thompson, Ashley E; Voyer, Daniel
2014-01-01
The present study aimed to quantify the magnitude of sex differences in humans' ability to accurately recognise non-verbal emotional displays. Studies of relevance were those that required explicit labelling of discrete emotions presented in the visual and/or auditory modality. A final set of 551 effect sizes from 215 samples was included in a multilevel meta-analysis. The results showed a small overall advantage in favour of females on emotion recognition tasks (d=0.19). However, the magnitude of that sex difference was moderated by several factors, namely specific emotion, emotion type (negative, positive), sex of the actor, sensory modality (visual, audio, audio-visual) and age of the participants. Method of presentation (computer, slides, print, etc.), type of measurement (response time, accuracy) and year of publication did not significantly contribute to variance in effect sizes. These findings are discussed in the context of social and biological explanations of sex differences in emotion recognition.
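For readers unfamiliar with how such an overall effect size is aggregated, a minimal random-effects (DerSimonian-Laird) sketch is shown below. The d values and variances are made-up placeholders, and the published analysis actually used a multilevel model to handle effect sizes nested within samples, which this simple version does not capture.

```python
# Hedged sketch: DerSimonian-Laird random-effects pooling of effect sizes.
# The per-study d values and variances below are hypothetical placeholders.
import numpy as np

d = np.array([0.30, 0.10, 0.25, 0.05, 0.22, 0.15])   # per-study effect sizes (hypothetical)
v = np.array([0.02, 0.04, 0.03, 0.05, 0.02, 0.03])   # per-study sampling variances (hypothetical)

w_fixed = 1.0 / v
d_fixed = np.sum(w_fixed * d) / np.sum(w_fixed)
Q = np.sum(w_fixed * (d - d_fixed) ** 2)              # heterogeneity statistic
df = len(d) - 1
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - df) / c)                          # between-study variance estimate

w = 1.0 / (v + tau2)                                   # random-effects weights
d_bar = np.sum(w * d) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
print(f"overall d = {d_bar:.3f}, 95% CI = [{d_bar - 1.96*se:.3f}, {d_bar + 1.96*se:.3f}]")
```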
LSD Acutely Impairs Fear Recognition and Enhances Emotional Empathy and Sociality
Dolder, Patrick C; Schmid, Yasmin; Müller, Felix; Borgwardt, Stefan; Liechti, Matthias E
2016-01-01
Lysergic acid diethylamide (LSD) is used recreationally and has been evaluated as an adjunct to psychotherapy to treat anxiety in patients with life-threatening illness. LSD is well-known to induce perceptual alterations, but unknown is whether LSD alters emotional processing in ways that can support psychotherapy. We investigated the acute effects of LSD on emotional processing using the Face Emotion Recognition Task (FERT) and Multifaceted Empathy Test (MET). The effects of LSD on social behavior were tested using the Social Value Orientation (SVO) test. Two similar placebo-controlled, double-blind, random-order, crossover studies were conducted using 100 μg LSD in 24 subjects and 200 μg LSD in 16 subjects. All of the subjects were healthy and mostly hallucinogen-naive 25- to 65-year-old volunteers (20 men, 20 women). LSD produced feelings of happiness, trust, closeness to others, enhanced explicit and implicit emotional empathy on the MET, and impaired the recognition of sad and fearful faces on the FERT. LSD enhanced the participants' desire to be with other people and increased their prosocial behavior on the SVO test. These effects of LSD on emotion processing and sociality may be useful for LSD-assisted psychotherapy. PMID:27249781
Hua, Alice Y; Sible, Isabel J; Perry, David C; Rankin, Katherine P; Kramer, Joel H; Miller, Bruce L; Rosen, Howard J; Sturm, Virginia E
2018-01-01
Behavioral variant frontotemporal dementia (bvFTD) is a neurodegenerative disease characterized by profound changes in emotions and empathy. Although most patients with bvFTD become less sensitive to negative emotional cues, some patients become more sensitive to positive emotional stimuli. We investigated whether dysregulated positive emotions in bvFTD undermine empathy by making it difficult for patients to share (emotional empathy), recognize (cognitive empathy), and respond (real-world empathy) to emotions in others. Fifty-one participants (26 patients with bvFTD and 25 healthy controls) viewed photographs of neutral, positive, negative, and self-conscious emotional faces and then identified the emotions displayed in the photographs. We used facial electromyography to measure automatic, sub-visible activity in two facial muscles during the task: Zygomaticus major (ZM), which is active during positive emotional reactions (i.e., smiling), and Corrugator supercilii (CS), which is active during negative emotional reactions (i.e., frowning). Participants rated their baseline positive and negative emotional experience before the task, and informants rated participants' real-world empathic behavior on the Interpersonal Reactivity Index. The majority of participants also underwent structural magnetic resonance imaging. A mixed effects model found a significant diagnosis × trial interaction: patients with bvFTD showed greater ZM reactivity to neutral, negative (disgust and surprise), self-conscious (proud), and positive (happy) faces than healthy controls. There was no main effect of diagnosis or diagnosis × trial interaction on CS reactivity. Compared to healthy controls, patients with bvFTD had impaired emotion recognition. Multiple regression analyses revealed that greater ZM reactivity predicted worse negative emotion recognition and worse real-world empathy. At baseline, positive emotional experience was higher in bvFTD than healthy controls and also predicted worse negative emotion recognition. Voxel-based morphometry analyses found that smaller volume in the thalamus, midcingulate cortex, posterior insula, anterior temporal pole, amygdala, precentral gyrus, and inferior frontal gyrus (structures that support emotion generation, interoception, and emotion regulation) was associated with greater ZM reactivity in bvFTD. These findings suggest that dysregulated positive emotional reactivity may relate to reduced empathy in bvFTD by making patients less likely to tune their reactions to the social context and to share, recognize, and respond to others' feelings and needs.
Melkonian, Alexander J; Ham, Lindsay S; Bridges, Ana J; Fugitt, Jessica L
2017-10-01
High rates of sexual victimization among college students necessitate further study of factors associated with sexual assault risk detection. The present study examined how social information processing relates to sexual assault risk detection as a function of sexual assault victimization history. Participants were 225 undergraduates (M age = 19.12 years, SD = 1.44; 66% women). They completed an online questionnaire assessing victimization history, an emotion identification task, and a sexual assault risk detection task between June 2013 and May 2014. Emotion identification moderated the association between victimization history and risk detection such that sexual assault survivors with lower emotion identification accuracy also reported the least risk in a sexual assault vignette. Findings suggest that differences in social information processing, specifically recognition of others' emotions, are associated with sexual assault risk detection. College prevention programs could incorporate emotional awareness strategies, particularly for men and women who are sexual assault survivors.
Ardizzi, Martina; Evangelista, Valentina; Ferroni, Francesca; Umiltà, Maria A.; Ravera, Roberto; Gallese, Vittorio
2017-01-01
One of the crucial features defining basic emotions and their prototypical facial expressions is their value for survival. Childhood traumatic experiences affect the effective recognition of facial expressions of negative emotions, normally allowing the recruitment of adequate behavioral responses to environmental threats. Specifically, anger becomes an extraordinarily salient stimulus unbalancing victims' recognition of negative emotions. Despite the plethora of studies on this topic, to date, it is not clear whether this phenomenon reflects an overall response tendency toward anger recognition or a selective proneness to the salience of specific facial expressive cues of anger after trauma exposure. To address this issue, a group of underage Sierra Leonean Ebola virus disease survivors (mean age 15.40 years, SE 0.35; years of schooling 8.8 years, SE 0.46; 14 males) and a control group (mean age 14.55, SE 0.30; years of schooling 8.07 years, SE 0.30, 15 males) performed a forced-choice chimeric facial expressions recognition task. The chimeric facial expressions were obtained by pairing upper and lower half faces of two different negative emotions (selected from anger, fear and sadness for a total of six different combinations). Overall, results showed that upper facial expressive cues were more salient than lower facial expressive cues. This priority was lost among Ebola virus disease survivors for the chimeric facial expressions of anger. In this case, differently from controls, Ebola virus disease survivors recognized anger regardless of the upper or lower position of the facial expressive cues of this emotion. The present results demonstrate that victims' performance in the recognition of the facial expression of anger does not reflect an overall response tendency toward anger recognition, but rather the specific greater salience of facial expressive cues of anger. Furthermore, the present results show that traumatic experiences deeply modify the perceptual analysis of phylogenetically old behavioral patterns like the facial expressions of emotions. PMID:28690565
Emotion recognition in girls with conduct problems.
Schwenck, Christina; Gensthaler, Angelika; Romanos, Marcel; Freitag, Christine M; Schneider, Wolfgang; Taurines, Regina
2014-01-01
A deficit in emotion recognition has been suggested to underlie conduct problems. Although several studies have been conducted on this topic so far, most concentrated on male participants. The aim of the current study was to compare recognition of morphed emotional faces in girls with conduct problems (CP) with elevated or low callous-unemotional (CU+ vs. CU-) traits and a matched healthy developing control group (CG). Sixteen girls with CP-CU+, 16 girls with CP-CU- and 32 controls (mean age: 13.23 years, SD=2.33 years) were included. Video clips with morphed faces were presented in two runs to assess emotion recognition. Multivariate analysis of variance with the factors group and run was performed. Girls with CP-CU- needed more time than the CG to encode sad, fearful, and happy faces and they correctly identified sadness less often. Girls with CP-CU+ outperformed the other groups in the identification of fear. Learning effects throughout runs were the same for all groups except that girls with CP-CU- correctly identified fear less often in the second run compared to the first run. Results need to be replicated with comparable tasks, which might result in subgroup-specific therapeutic recommendations.
Balconi, M; Cobelli, C
2015-02-26
The present research explored the cortical correlates of emotional memories in response to words and pictures. Subjects' performance (Accuracy Index, AI; response times, RTs; RTs/AI) was examined while repetitive transcranial magnetic stimulation (rTMS) was applied over the left dorsolateral prefrontal cortex (LDLPFC). Specifically, the role of the LDLPFC was tested with a memory task in which old (previously encoded targets) and new (previously unencoded distractors) emotional pictures/words had to be recognized. Valence (positive vs. negative) and arousing power (high vs. low) of the stimuli were also manipulated. Moreover, subjective evaluation of the emotional stimuli in terms of valence/arousal was explored. We found significant performance improvements (higher AI, reduced RTs, improved general performance) in response to rTMS. This "better recognition effect" was related only to specific emotional features, that is, positive high-arousal pictures or words. Moreover, no significant differences were found between stimulus categories. A direct relationship was also observed between the subjective evaluation of emotional cues and memory performance when rTMS was applied to the LDLPFC. Drawing on the valence and approach model of emotions, we propose that a left-lateralized prefrontal system may support better recognition of positive high-arousal words, and that the evaluation of emotional cues is related to prefrontal activation, affecting recognition memory for emotions. Copyright © 2014 IBRO. Published by Elsevier Ltd. All rights reserved.
Ji, E; Weickert, C S; Lenroot, R; Kindler, J; Skilleter, A J; Vercammen, A; White, C; Gur, R E; Weickert, T W
2016-05-03
Estrogen has been implicated in the development and course of schizophrenia with most evidence suggesting a neuroprotective effect. Treatment with raloxifene, a selective estrogen receptor modulator, can reduce symptom severity, improve cognition and normalize brain activity during learning in schizophrenia. People with schizophrenia are especially impaired in the identification of negative facial emotions. The present study was designed to determine the extent to which adjunctive raloxifene treatment would alter abnormal neural activity during angry facial emotion recognition in schizophrenia. Twenty people with schizophrenia (12 men, 8 women) participated in a 13-week, randomized, double-blind, placebo-controlled, crossover trial of adjunctive raloxifene treatment (120 mg per day orally) and performed a facial emotion recognition task during functional magnetic resonance imaging after each treatment phase. Two-sample t-tests in regions of interest selected a priori were performed to assess activation differences between raloxifene and placebo conditions during the recognition of angry faces. Adjunctive raloxifene significantly increased activation in the right hippocampus and left inferior frontal gyrus compared with the placebo condition (family-wise error, P<0.05). There was no significant difference in performance accuracy or reaction time between active and placebo conditions. To the best of our knowledge, this study provides the first evidence suggesting that adjunctive raloxifene treatment changes neural activity in brain regions associated with facial emotion recognition in schizophrenia. These findings support the hypothesis that estrogen plays a modifying role in schizophrenia and shows that adjunctive raloxifene treatment may reverse abnormal neural activity during facial emotion recognition, which is relevant to impaired social functioning in men and women with schizophrenia.
Bat-Pitault, F; Da Fonseca, D; Flori, S; Porcher-Guinet, V; Stagnara, C; Patural, H; Franco, P; Deruelle, C
2017-10-01
Emotional processing is characterized by a negative bias in depression, so it is legitimate to ask whether the same is true in very young at-risk children. Furthermore, sleep, also proposed as a marker of depression risk, is closely linked with emotions in adults and adolescents. We therefore wanted, first, to better describe the characteristics of emotion recognition in 3-year-olds and its links with sleep and, secondly, to determine whether an emotion recognition pattern indicating vulnerability to depression can already be found at this young age. We studied, in 133 children aged 36 months from the AuBE cohort, the number of correct answers on a facial emotion recognition task (joy, anger and sadness). Cognitive functions were also assessed with the WPPSI-III at 3 years of age, and sleep parameters (lights-off and lights-on times, sleep times, difficulty going to sleep and number of parental awakenings per night) were described by questionnaires completed by mothers at 6, 12, 18, 24 and 36 months after birth. Of these 133 children, 21 whose mothers had at least one history of depression (13 boys) formed the high-risk group, and 19 children (8 boys) born to women with no history of depression formed the low-risk (control) group. Overall, at 36 months the children recognized happiness significantly better than the other emotions (P=0.000), with better global recognition in girls (M=8.8) than boys (M=7.8) (P=0.013) and a positive correlation between global recognition ability and verbal IQ (P=0.000). Children who had less daytime sleep at 18 months and those who slept less at 24 months showed better recognition of sadness (P=0.043 and P=0.042); those with difficulties at bedtime at 18 months recognized happiness less well (P=0.043), and those who woke earlier at 24 months showed better global recognition of emotions (P=0.015). Finally, boys in the high-risk group recognized sadness better than boys in the control group (P=0.015). This study confirms that emotion recognition is related to development, with a female advantage and a link with language skills at 36 months. More importantly, we found a relationship between sleep characteristics and emotion recognition ability, and a negative bias in emotion recognition in young males at risk for depression. Copyright © 2016 L'Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.
Xie, Weizhen; Cappiello, Marcus; Meng, Ming; Rosenthal, Robert; Zhang, Weiwei
2018-05-08
This meta-analytical review examines whether a deletion variant in ADRA2B, a gene that encodes the α2B adrenoceptor involved in the regulation of norepinephrine availability, influences cognitive processing of emotional information in human observers. Using a multilevel modeling approach, this meta-analysis of 16 published studies with a total of 2,752 participants showed that the ADRA2B deletion variant was significantly associated with enhanced perceptual and cognitive task performance for emotional stimuli. In contrast, this genetic effect did not manifest in overall task performance when non-emotional content was used. Furthermore, various study-level factors, such as targeted cognitive processes (memory vs. attention/perception) and task procedures (recall vs. recognition), could moderate the size of this genetic effect. Overall, with increased statistical power and standardized analytical procedures, this meta-analysis has established the contributions of ADRA2B to the interactions between emotion and cognition, adding to the growing literature on individual differences in attention, perception, and memory for emotional information in the general population. Copyright © 2018 Elsevier Ltd. All rights reserved.
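The abstract does not spell out the pooling model; as a rough, hypothetical illustration of how per-study effects of a genotype could be combined, the sketch below implements a standard DerSimonian-Laird random-effects meta-analysis in Python. All effect sizes and variances are invented placeholders, not data from the 16 studies, and the authors' multilevel structure is not reproduced here.

```python
import numpy as np

def random_effects_meta(effects, variances):
    """DerSimonian-Laird random-effects pooling of per-study effect sizes."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)

    # Fixed-effect (inverse-variance) weights and heterogeneity statistic Q
    w = 1.0 / variances
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)

    # Between-study variance tau^2, truncated at zero
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)

    # Random-effects weights, pooled estimate and its standard error
    w_star = 1.0 / (variances + tau2)
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# Hypothetical per-study standardized mean differences (carriers vs. non-carriers)
effects = [0.32, 0.18, 0.41, 0.05, 0.27]
variances = [0.02, 0.05, 0.03, 0.04, 0.06]
pooled, se, tau2 = random_effects_meta(effects, variances)
print(f"pooled d = {pooled:.3f} (95% CI +/- {1.96 * se:.3f}), tau^2 = {tau2:.3f}")
```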
The Development of the Young Child's Representation of Emotion.
ERIC Educational Resources Information Center
Ireson, Judith M.; Shields, Maureen
The purpose of this study was to trace the development of children's understanding of emotions between the ages of 4 and 12. Twenty-four children at each of five age groups (4, 6, 8, 10, and 12), equally divided by sex, served as subjects. Three groups of tasks were administered: (1) the recognition of facial expressions from photographs, (2) the…
Svärd, Joakim; Wiens, Stefan; Fischer, Håkan
2012-01-01
In the aging literature it has been shown that even though emotion recognition performance decreases with age, the decrease is less for happiness than other facial expressions. Studies in younger adults have also revealed that happy faces are more strongly attended to and better recognized than other emotional facial expressions. Thus, there might be an age-independent happy face advantage in facial expression recognition. By using a backward masking paradigm and varying stimulus onset asynchronies (17–267 ms), the temporal development of a happy face advantage, on a continuum from low to high levels of visibility, was examined in younger and older adults. Results showed that across age groups, recognition performance for happy faces was better than for neutral and fearful faces at durations longer than 50 ms. Importantly, the results showed a happy face advantage already during early processing of emotional faces in both younger and older adults. This advantage is discussed in terms of processing of salient perceptual features and elaborative processing of the happy face. We also investigated the combined effect of age and neuroticism on emotional face processing. The rationale was based on previous findings of age-related differences in physiological arousal to emotional pictures and a relation between arousal and neuroticism. Across all durations, there was an interaction between age and neuroticism, showing that being high in neuroticism might be disadvantageous for younger, but not older, adults’ emotion recognition performance during arousal-enhancing tasks. These results indicate that there is a relation between aging, neuroticism, and performance, potentially related to physiological arousal. PMID:23226135
Kim, Youl-Ri; Eom, Jin-Sup; Yang, Jae-Won; Kang, Jiwon; Treasure, Janet
2015-01-01
Social difficulties and problems related to eating behaviour are common features of both anorexia nervosa (AN) and bulimia nervosa (BN). The aim of this study was to examine the impact of intranasal oxytocin on consummatory behaviour and emotional recognition in patients with AN and BN in comparison to healthy controls. A total of 102 women, including 35 patients with AN, 34 patients with BN, and 33 healthy university students of comparable age and intelligence, participated in a double-blind, single-dose, placebo-controlled cross-over study. A single intranasal dose of oxytocin (40 IU) or a placebo was followed by an emotional recognition task and an apple juice drink. Food intake was then recorded for 24 hours post-test. Oxytocin produced no significant change in appetite in the acute or 24-hour free-living settings in healthy controls, whereas there was a decrease in calorie consumption over 24 hours in patients with BN. Oxytocin produced a small increase in emotion recognition sensitivity in healthy controls and in patients with BN. In patients with AN, oxytocin had no effect on emotion recognition sensitivity or on consummatory behaviour. The impact of oxytocin on appetite and social cognition varied between people with AN and BN. A single dose of intranasal oxytocin decreased caloric intake over 24 hours in people with BN. People with BN showed enhanced emotional sensitivity under the oxytocin condition, similar to healthy controls. These effects of oxytocin were not found in patients with AN. ClinicalTrials.gov KCT00000716.
De Los Reyes, Andres; Lerner, Matthew D; Thomas, Sarah A; Daruwala, Samantha; Goepel, Katherine
2013-08-01
Parents and children and adolescents commonly disagree in their perceptions of a variety of behaviors, including the family relationship and environment, and child and adolescent psychopathology. Accordingly, numerous studies have examined to what extent increased discrepant perceptions (particularly with regard to perceptions of the family relationship and environment) predict increased child and adolescent psychopathology. Parents' and children and adolescents' abilities to decode and identify others' emotions (i.e., emotion recognition) may play a role in the link between discrepant perceptions and child and adolescent psychopathology. We examined parents' and adolescents' emotion recognition abilities in relation to discrepancies between parent and adolescent perceptions of daily life topics. In a sample of 50 parents and adolescents ages 14 to 17 years (M = 15.4 years, 20 males, 54% African-American), parents and adolescents were each administered a widely used performance-based measure of emotion recognition. Parents and adolescents were also administered a structured interview designed to directly assess each of their perceptions of the extent to which discrepancies existed in their beliefs about daily life topics (e.g., whether adolescents should complete their homework and carry out household chores). Interestingly, lower parent and adolescent emotion recognition performance was significantly related to greater parent and adolescent perceived discrepant beliefs about daily life topics. We observed this relation whilst accounting for adolescent age and gender and levels of parent-adolescent conflict. These findings have important implications for understanding and using informant discrepancies in both basic developmental psychopathology research and applied research in clinic settings (e.g., discrepant views on therapeutic goals).
Emotion categorization of body expressions in narrative scenarios
Volkova, Ekaterina P.; Mohler, Betty J.; Dodds, Trevor J.; Tesch, Joachim; Bülthoff, Heinrich H.
2014-01-01
Humans can recognize emotions expressed through body motion with high accuracy even when the stimuli are impoverished. However, most of the research on body motion has relied on exaggerated displays of emotions. In this paper we present two experiments where we investigated whether emotional body expressions could be recognized when they were recorded during natural narration. Our actors were free to use their entire body, face, and voice to express emotions, but our resulting visual stimuli used only the upper body motion trajectories in the form of animated stick figures. Observers were asked to perform an emotion recognition task on short motion sequences using a large and balanced set of emotions (amusement, joy, pride, relief, surprise, anger, disgust, fear, sadness, shame, and neutral). Even with only upper body motion available, our results show recognition accuracy significantly above chance level and high consistency rates among observers. In our first experiment, which used a more classic emotion-induction setup, all emotions were well recognized. In the second study, which employed narrations, four basic emotion categories (joy, anger, fear, and sadness), three non-basic emotion categories (amusement, pride, and shame) and the “neutral” category were recognized above chance. Interestingly, especially in the second experiment, observers showed a bias toward anger when recognizing the motion sequences for emotions. We discovered that similarities between motion sequences across emotions in properties such as mean motion speed, number of peaks in the motion trajectory and mean motion span explain a large percentage of the variation in observers' responses. Overall, our results show that upper body motion is informative for emotion recognition in narrative scenarios. PMID:25071623
Jurado-Berbel, Patricia; Costa-Miserachs, David; Torras-Garcia, Meritxell; Coll-Andreu, Margalida; Portell-Cortés, Isabel
2010-02-11
The present work examined whether post-training systemic epinephrine (EPI) is able to modulate short-term (3 h) and long-term (24 h and 48 h) memory of standard object recognition, as well as long-term (24 h) memory of separate "what" (object identity) and "where" (object location) components of object recognition. Although object recognition training is associated with low arousal levels, all the animals received habituation to the training box in order to further reduce emotional arousal. Post-training EPI improved long-term (24 h and 48 h), but not short-term (3 h), memory in the standard object recognition task, as well as 24 h memory for both object identity and object location. These data indicate that post-training epinephrine: (1) facilitates long-term memory for standard object recognition; (2) exerts separate facilitatory effects on "what" (object identity) and "where" (object location) components of object recognition; and (3) is capable of improving memory for a low-arousing task even in highly habituated rats.
NASA Astrophysics Data System (ADS)
Li, Ji; Ren, Fuji
Weblogs have greatly changed the way people communicate. Affective analysis of blog posts has proven valuable for many applications such as text-to-speech synthesis or computer-assisted recommendation. Traditional emotion recognition in text based on single-label classification cannot satisfy the higher requirements of affective computing. In this paper, the automatic identification of sentence emotion in weblogs is modeled as a multi-label text categorization task. Experiments are carried out on 12273 blog sentences from the Chinese emotion corpus Ren_CECps with 8-dimension emotion annotation. The ensemble algorithm RAKEL is used to recognize dominant emotions from the writer's perspective. Our emotion feature, which uses a detailed intensity representation for word emotions, outperforms the other main features such as the word frequency feature and the traditional lexicon-based feature. In order to deal with relatively complex sentences, we integrate grammatical characteristics of punctuation, disjunctive connectives, modification relations and negation into the features. This achieves 13.51% and 12.49% increases in Micro-averaged F1 and Macro-averaged F1, respectively, compared to the traditional lexicon-based feature. Results show that multi-dimensional emotion representation with grammatical features can efficiently classify sentence emotion in a multi-label setting.
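RAKEL (random k-labelsets) builds an ensemble of label-powerset classifiers, each trained on a random subset of k labels, and combines their votes into a multi-label prediction. The sketch below is a minimal, generic illustration of that idea in Python, assuming a numeric feature matrix and an 8-column binary emotion label matrix; the class name RakelSketch, the base classifier and the random stand-in data are illustrative assumptions, not the authors' implementation or features.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

class RakelSketch:
    """Minimal random-k-labelsets (RAkEL-style) multi-label classifier."""

    def __init__(self, n_members=10, k=3, seed=0):
        self.n_members, self.k = n_members, k
        self.rng = np.random.default_rng(seed)

    def fit(self, X, Y):
        self.n_labels = Y.shape[1]
        self.members = []
        for _ in range(self.n_members):
            subset = self.rng.choice(self.n_labels, size=self.k, replace=False)
            # Encode each sample's combination of the k labels as one "powerset" class
            codes = np.array([int("".join(map(str, row)), 2) for row in Y[:, subset]])
            clf = LogisticRegression(max_iter=1000).fit(X, codes)
            self.members.append((subset, clf))
        return self

    def predict(self, X):
        votes = np.zeros((X.shape[0], self.n_labels))
        counts = np.zeros(self.n_labels)
        shifts = np.arange(self.k - 1, -1, -1)
        for subset, clf in self.members:
            bits = (clf.predict(X)[:, None] >> shifts) & 1  # decode powerset class back to k bits
            votes[:, subset] += bits
            counts[subset] += 1
        # A label is switched on when at least half of the members covering it vote for it
        return (votes / np.maximum(counts, 1) >= 0.5).astype(int)

# Illustrative usage with random stand-in data (sentence features x 8 emotion labels)
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 50))
Y = (rng.random((200, 8)) < 0.3).astype(int)
print(RakelSketch(n_members=12, k=3).fit(X, Y).predict(X[:5]))
```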
Cognitive contributions to theory of mind ability in children with a traumatic head injury.
Levy, Naomi Kahana; Milgram, Noach
2016-01-01
The objective of the current study is to examine the contribution of intellectual abilities, executive functions (EF), and facial emotion recognition to difficulties in Theory of Mind (ToM) abilities in children with a traumatic head injury. Israeli children with a traumatic head injury were compared with their non-injured counterparts. Each group included 18 children (12 males) ages 7-13. Measurements included reading the mind in the eyes, facial emotion recognition, reasoning the other's characteristics based on motive and outcome, Raven's Coloured Progressive Matrices, similarities and digit span (Wechsler Intelligence Scale for Children - Revised 95 subscales), verbal fluency, and the Behaviour Rating Inventory of Executive Functions. Non-injured children performed significantly better on ToM, abstract reasoning, and EF measures compared with children with a traumatic head injury. However, differences in ToM abilities between the groups were no longer significant after controlling for abstract reasoning, working memory, verbal fluency, or facial emotion recognition. Impaired ToM recognition and reasoning abilities after a head injury may result from other cognitive impairments. In children with mild and moderate head injury, poorer performance on ToM tasks may reflect poorer abstract reasoning, a general tendency to concretize stimuli, working memory and verbal fluency deficits, and difficulties in facial emotion recognition, rather than deficits in the ability to understand the other's thoughts and emotions. ToM impairments may be secondary to a range of cognitive deficits in determining social outcomes in this population.
Almeida, Inês; van Asselen, Marieke; Castelo-Branco, Miguel
2013-09-01
In human cognition, most relevant stimuli, such as faces, are processed in central vision. However, it is widely believed that recognition of relevant stimuli (e.g. threatening animal faces) at peripheral locations is also important due to their survival value. Moreover, task instructions have been shown to modulate brain regions involved in threat recognition (e.g. the amygdala). In this respect it is also controversial whether tasks requiring explicit focus on stimulus threat content vs. implicit processing differently engage primitive subcortical structures involved in emotional appraisal. Here we have addressed the role of central vs. peripheral processing in the human amygdala using animal threatening vs. non-threatening face stimuli. First, a simple animal face recognition task with threatening and non-threatening animal faces, as well as non-face control stimuli, was employed in naïve subjects (implicit task). A subsequent task was then performed with the same stimulus categories (but different stimuli) in which subjects were told to explicitly detect threat signals. We found lateralized amygdala responses both to the spatial location of stimuli and to the threatening content of faces depending on the task performed: the right amygdala showed increased responses to centrally presented compared with left-presented stimuli specifically during the threat detection task, while the left amygdala was better able to discriminate threatening faces from non-facial displays during the animal face recognition task. Additionally, the right amygdala responded to faces during the threat detection task but only when they were centrally presented. Moreover, we found no evidence for superior responses of the amygdala to peripheral stimuli. Importantly, we found that striatal regions activate differentially depending on peripheral vs. central processing of threatening faces. Accordingly, peripheral processing of these stimuli activated the putaminal region more strongly, while central processing engaged mainly the caudate nucleus. We conclude that the human amygdala has a central bias for face stimuli, and that visual processing recruits different striatal regions, putamen- or caudate-based, depending on the task and on whether peripheral or central visual processing is involved. © 2013 Elsevier Ltd. All rights reserved.
Negative ion treatment increases positive emotional processing in seasonal affective disorder.
Harmer, C J; Charles, M; McTavish, S; Favaron, E; Cowen, P J
2012-08-01
Antidepressant drug treatments increase the processing of positive compared to negative affective information early in treatment. Such effects have been hypothesized to play a key role in the development of later therapeutic responses to treatment. However, it is unknown whether these effects are a common mechanism of action for different treatment modalities. High-density negative ion (HDNI) treatment is an environmental manipulation that has efficacy in randomized clinical trials in seasonal affective disorder (SAD). The current study investigated whether a single session of HDNI treatment could reverse negative affective biases seen in seasonal depression using a battery of emotional processing tasks in a double-blind, placebo-controlled randomized study. Under placebo conditions, participants with seasonal mood disturbance showed reduced recognition of happy facial expressions, increased recognition memory for negative personality characteristics and increased vigilance to masked presentation of negative words in a dot-probe task compared to matched healthy controls. Negative ion treatment increased the recognition of positive compared to negative facial expression and improved vigilance to unmasked stimuli across participants with seasonal depression and healthy controls. Negative ion treatment also improved recognition memory for positive information in the SAD group alone. These effects were seen in the absence of changes in subjective state or mood. These results are consistent with the hypothesis that early change in emotional processing may be an important mechanism for treatment action in depression and suggest that these effects are also apparent with negative ion treatment in seasonal depression.
The effect of mild acute stress during memory consolidation on emotional recognition memory.
Corbett, Brittany; Weinberg, Lisa; Duarte, Audrey
2017-11-01
Stress during consolidation improves recognition memory performance. Generally, this memory benefit is greater for emotionally arousing stimuli than neutral stimuli. The strength of the stressor also plays a role in memory performance, with memory performance improving up to a moderate level of stress and thereafter worsening. As our daily stressors are generally minimal in strength, we chose to induce mild acute stress to determine its effect on memory performance. In the current study, we investigated whether mild acute stress during consolidation improves memory performance for emotionally arousing images. To investigate this, we had participants encode highly arousing negative, minimally arousing negative, and neutral images. We induced stress using the Montreal Imaging Stress Task (MIST) in half of the participants and administered a control task to the other half directly after encoding (i.e. during consolidation), and tested recognition 48 h later. We found no difference in memory performance between the stress and control groups. We found a graded pattern in confidence, with responders in the stress group having the least confidence in their hits and controls having the most. Across groups, we found that highly arousing negative images were better remembered than minimally arousing negative or neutral images. Although stress did not affect memory accuracy, responders, as defined by cortisol reactivity, were less confident in their decisions. Our results suggest that the daily stressors humans experience, regardless of their emotional affect, do not have adverse effects on memory. Copyright © 2017 Elsevier Inc. All rights reserved.
Updating schematic emotional facial expressions in working memory: Response bias and sensitivity.
Tamm, Gerly; Kreegipuu, Kairi; Harro, Jaanus; Cowan, Nelson
2017-01-01
It is unclear if positive, negative, or neutral emotional expressions have an advantage in short-term recognition. Moreover, it is unclear from previous studies of working memory for emotional faces whether effects of emotion reflect response bias or sensitivity. The aim of this study was to compare how schematic emotional expressions (sad, angry, scheming, happy, and neutral) are discriminated and recognized in an updating task (2-back recognition) in a representative birth-cohort sample of young adults. Schematic facial expressions allow control of identity processing, which is separate from expression processing, and have been used extensively in attention research but not much, until now, in working memory research. We found that expressions with a U-curved mouth (i.e., upwardly curved), namely happy and scheming expressions, favoured a bias towards recognition (i.e., towards indicating that the probe and the stimulus in working memory are the same). Other effects of emotional expression were considerably smaller (1-2% of the variance explained) compared to the large proportion of variance that was explained by the physical similarity of items being compared. We suggest that the nature of the stimuli plays a role in this. The present application of signal detection methodology with emotional, schematic faces in a working memory procedure requiring fast comparisons helps to resolve important contradictions that have emerged in the emotional perception literature. Copyright © 2016 Elsevier B.V. All rights reserved.
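For readers unfamiliar with the signal detection quantities discussed above, sensitivity (d') and response bias (criterion c) can be derived from hit and false-alarm rates with the inverse-normal transform. The snippet below is a generic illustration with invented counts, not the study's analysis code.

```python
from scipy.stats import norm

def dprime_and_criterion(hits, misses, false_alarms, correct_rejections):
    """Sensitivity (d') and response bias (criterion c) from recognition counts.

    A small correction keeps the rates away from 0 and 1 so the z-transform is defined.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_hit - z_fa              # higher = better discrimination of matching vs. non-matching probes
    criterion = -0.5 * (z_hit + z_fa)   # negative values = liberal bias towards responding "same"
    return d_prime, criterion

# Invented counts for one expression condition in a 2-back recognition task
print(dprime_and_criterion(hits=78, misses=22, false_alarms=30, correct_rejections=70))
```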
Liu, Pan; Pell, Marc D
2012-12-01
To establish a valid database of vocal emotional stimuli in Mandarin Chinese, a set of Chinese pseudosentences (i.e., semantically meaningless sentences that resembled real Chinese) were produced by four native Mandarin speakers to express seven emotional meanings: anger, disgust, fear, sadness, happiness, pleasant surprise, and neutrality. These expressions were identified by a group of native Mandarin listeners in a seven-alternative forced choice task, and items reaching a recognition rate of at least three times chance performance in the seven-choice task were selected as a valid database and then subjected to acoustic analysis. The results demonstrated expected variations in both perceptual and acoustic patterns of the seven vocal emotions in Mandarin. For instance, fear, anger, sadness, and neutrality were associated with relatively high recognition, whereas happiness, disgust, and pleasant surprise were recognized less accurately. Acoustically, anger and pleasant surprise exhibited relatively high mean f0 values and large variation in f0 and amplitude; in contrast, sadness, disgust, fear, and neutrality exhibited relatively low mean f0 values and small amplitude variations, and happiness exhibited a moderate mean f0 value and f0 variation. Emotional expressions varied systematically in speech rate and harmonics-to-noise ratio values as well. This validated database is available to the research community and will contribute to future studies of emotional prosody for a number of purposes. To access the database, please contact pan.liu@mail.mcgill.ca.
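As a quick arithmetic check (not taken from the paper's materials), the inclusion threshold of three times chance in a seven-alternative task works out to roughly 43% correct identification:

```python
chance = 1 / 7          # seven-alternative forced choice
criterion = 3 * chance  # "three times chance" inclusion threshold
print(f"chance = {chance:.1%}, inclusion criterion = {criterion:.1%}")  # 14.3% and 42.9%
```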
Metsala, Jamie L; Galway, Tanya M; Ishaik, Galit; Barton, Veronica E
2017-07-01
Nonverbal learning disability is a childhood disorder with basic neuropsychological deficits in visuospatial processing and psychomotor coordination, and secondary impairments in academic and social-emotional functioning. This study examines emotion recognition, understanding, and regulation in a clinic-referred group of young children with nonverbal learning disabilities (NLD). These processes have been shown to be related to social competence and psychological adjustment in typically developing (TD) children. Psychosocial adjustment and social skills are also examined for this young group, and for a clinic-referred group of older children with NLD. The young children with NLD scored lower than the TD comparison group on tasks assessing recognition of happy and sad facial expressions and tasks assessing understanding of how emotions work. Children with NLD were also rated as having less adaptive regulation of their emotions. For both young and older children with NLD, internalizing and externalizing problem scales were rated higher than for the TD comparison groups, and the means of the internalizing, attention, and social problem scales were found to fall within clinically concerning ranges. Measures of attention and nonverbal intelligence did not account for the relationship between NLD and Social Problems. Social skills and NLD membership share mostly overlapping variance in accounting for internalizing problems across the sample. The results are discussed within a framework wherein social cognitive deficits, including emotion processes, have a negative impact on social competence, leading to clinically concerning levels of depression and withdrawal in this population.
Anodal tDCS targeting the right orbitofrontal cortex enhances facial expression recognition
Murphy, Jillian M.; Ridley, Nicole J.; Vercammen, Ans
2015-01-01
The orbitofrontal cortex (OFC) has been implicated in the capacity to accurately recognise facial expressions. The aim of the current study was to determine if anodal transcranial direct current stimulation (tDCS) targeting the right OFC in healthy adults would enhance facial expression recognition, compared with a sham condition. Across two counterbalanced sessions of tDCS (i.e. anodal and sham), 20 undergraduate participants (18 female) completed a facial expression labelling task comprising angry, disgusted, fearful, happy, sad and neutral expressions, and a control (social judgement) task comprising the same expressions. Responses on the labelling task were scored for accuracy, median reaction time and overall efficiency (i.e. combined accuracy and reaction time). Anodal tDCS targeting the right OFC enhanced facial expression recognition, reflected in greater efficiency and speed of recognition across emotions, relative to the sham condition. In contrast, there was no effect of tDCS to responses on the control task. This is the first study to demonstrate that anodal tDCS targeting the right OFC boosts facial expression recognition. This finding provides a solid foundation for future research to examine the efficacy of this technique as a means to treat facial expression recognition deficits, particularly in individuals with OFC damage or dysfunction. PMID:25971602
Gender differences in brain networks supporting empathy.
Schulte-Rüther, Martin; Markowitsch, Hans J; Shah, N Jon; Fink, Gereon R; Piefke, Martina
2008-08-01
Females frequently score higher on standard tests of empathy, social sensitivity, and emotion recognition than do males. It remains to be clarified, however, whether these gender differences are associated with gender specific neural mechanisms of emotional social cognition. We investigated gender differences in an emotion attribution task using functional magnetic resonance imaging. Subjects either focused on their own emotional response to emotion expressing faces (SELF-task) or evaluated the emotional state expressed by the faces (OTHER-task). Behaviorally, females rated SELF-related emotions significantly stronger than males. Across the sexes, SELF- and OTHER-related processing of facial expressions activated a network of medial and lateral prefrontal, temporal, and parietal brain regions involved in emotional perspective taking. During SELF-related processing, females recruited the right inferior frontal cortex and superior temporal sulcus stronger than males. In contrast, there was increased neural activity in the left temporoparietal junction in males (relative to females). When performing the OTHER-task, females showed increased activation of the right inferior frontal cortex while there were no differential activations in males. The data suggest that females recruit areas containing mirror neurons to a higher degree than males during both SELF- and OTHER-related processing in empathic face-to-face interactions. This may underlie facilitated emotional "contagion" in females. Together with the observation that males differentially rely on the left temporoparietal junction (an area mediating the distinction between the SELF and OTHERS) the data suggest that females and males rely on different strategies when assessing their own emotions in response to other people.
Repetition Blindness for Faces: A Comparison of Face Identity, Expression, and Gender Judgments
Murphy, Karen; Ward, Zoe
2017-01-01
Repetition blindness (RB) refers to the impairment in reporting two identical targets within a rapid serial visual presentation stream. While numerous studies have demonstrated RB for words and picture of objects, very few studies have examined RB for faces. This study extended this research by examining RB when the two faces were complete repeats (same emotion and identity), identity repeats (same individual, different emotion), and emotion repeats (different individual, same emotion) for identity, gender, and expression judgment tasks. Complete RB and identity RB effects were evident for all three judgment tasks. Emotion RB was only evident for the expression and gender judgments. Complete RB effects were larger than emotion or identity RB effects across all judgment tasks. For the expression judgments, there was more emotion than identity RB. The identity RB effect was larger than the emotion RB effect for the gender judgments. Cross task comparisons revealed larger complete RB effects for the expression and gender judgments than the identity decisions. There was a larger emotion RB effect for the expression than gender judgments and the identity RB effect was larger for the gender than for the identity and expression judgments. These results indicate that while faces are subject to RB, this is affected by the type of repeated information and relevance of the facial characteristic to the judgment decision. This study provides further support for the operation of separate processing mechanisms for face gender, emotion, and identity information within models of face recognition. PMID:29038663
Effects of testosterone on attention and memory for emotional stimuli in male rhesus monkeys.
King, Hanna M; Kurdziel, Laura B; Meyer, Jerrold S; Lacreuse, Agnès
2012-03-01
Increasing evidence in humans and other animals suggests that testosterone (T) plays an important role in modulating emotion. We previously reported that T treatment in rhesus monkeys undergoing chemically induced hypogonadism results in increased watching time of videos depicting fights between unfamiliar conspecifics (Lacreuse et al., 2010). In the current study, we aimed to further investigate the effect of T manipulations on attention and memory for emotional stimuli in male rhesus monkeys. Six males (7 years old) were administered Depot Lupron to suppress endogenous T levels and treated with either testosterone enanthate (TE, 5 mg/kg) or oil, before crossing over to the alternate treatment. Animals were tested for 16 weeks on two computerized touchscreen tasks with both social and nonsocial emotional and neutral stimuli. The Dot-Probe task was used to measure attention, and the Delayed-Non-Matching-to-Sample task with a 1s delay (DNMS) was used to measure recognition memory for these stimuli. Performance on the two tasks was examined during each of four month-long phases: Baseline, Lupron alone, Lupron+TE and Lupron+oil. It was predicted that T administration would lead to increased attention to negative social stimuli (i.e., negative facial expressions of unfamiliar conspecifics) and would improve memory for such stimuli. We found no evidence to support these predictions. In the Dot-Probe task, an attentional bias towards negative social stimuli was observed at baseline, but T treatment did not enhance this bias. Instead, monkeys had faster response times when treated with T compared to oil, independently of the emotional valence or social relevance of stimuli, perhaps reflecting an enhancing effect of T on reward sensitivity or general arousal. In the DNMS, animals had better memory for nonsocial compared to social stimuli and showed the poorest performance in the recognition of positive facial expressions. However, T did not affect performance on the task. Thus, even though monkeys were sensitive to the social relevance and emotional valence of the stimuli in the two tasks, T manipulations had no effect on attention or memory for these stimuli. Because habituation to the stimuli may have mitigated the effect of treatment in the attentional task, we suggest that T may increase attentional biases to negative social stimuli only during early exposure to the stimuli with acute treatment or when stimuli are highly arousing (i.e., dynamically presented) with chronic treatment. In addition, the data suggest that T does not enhance working memory for emotional stimuli in young male macaques. Copyright © 2011 Elsevier Ltd. All rights reserved.
Electrophysiological correlates of encoding and retrieving emotional events.
Koenig, Stefanie; Mecklinger, Axel
2008-04-01
This study examined the impact of emotional content on encoding and retrieval processes. Event-related potentials were recorded in a source recognition memory task. During encoding, a posterior positivity for positive and negative pictures (250-450 ms) was found, presumably reflecting attentional capture by emotionally valenced stimuli. Additionally, positive events, which were also rated as less arousing than negative events, gave rise to anterior and posterior slow wave activity as compared with neutral and negative events and also showed enhanced recognition memory. It is assumed that positive low-arousing events enter controlled and elaborated encoding processes that are beneficial for recognition memory performance. The high arousal of negative events may interfere with controlled encoding mechanisms and attenuate item recognition and the quality of remembering. Moreover, topographically distinct late posterior negativities were obtained for the retrieval of the context features location and time, supporting the view that this component reflects processes that serve to reconstruct the study episode by binding contextual details to an item, and that it varies with the kind of episodic detail to be retrieved. Copyright 2008 APA.
The Effects of Cognitive Reappraisal and Expressive Suppression on Memory of Emotional Pictures.
Wang, Yan Mei; Chen, Jie; Han, Ben Yue
2017-01-01
In the field of emotion research, the influence of emotion regulation strategies on memory with emotional materials has been widely discussed in recent years. However, existing studies have focused exclusively on regulating negative emotion but not positive emotion. Therefore, in the present study, we investigated the influence of emotion regulation strategies for positive emotion on memory. One hundred and twenty college students were selected as participants. Emotional pictures (positive, negative and neutral) were selected from Chinese Affective Picture System (CAPS) as experimental materials. We employed a mixed, 4 (emotion regulation strategies: cognitive up-regulation, cognitive down-regulation, expressive suppression, passive viewing) × 3 (emotional pictures: positive, neutral, negative) experimental design. We investigated the influences of different emotion regulation strategies on memory performance, using free recall and recognition tasks with pictures varying in emotional content. The results showed that recognition and free recall memory performance of the cognitive reappraisal groups (up-regulation and down-regulation) were both better than that of the passive viewing group for all emotional pictures. No significant differences were reported in the two kinds of memory scores between the expressive suppression and passive viewing groups. The results also showed that the memory performance with the emotional pictures differed according to the form of memory test. For the recognition test, participants performed better with positive images than with neutral images. Free recall scores with negative images were higher than those with neutral images. These results suggest that both cognitive reappraisal regulation strategies (up-regulation and down-regulation) promoted explicit memories of the emotional content of stimuli, and the form of memory test influenced performance with emotional pictures.
Children's understanding of facial expression of emotion: II. Drawing of emotion-faces.
Missaghi-Lakshman, M; Whissell, C
1991-06-01
67 children from Grades 2, 4, and 7 drew faces representing the emotional expressions of fear, anger, surprise, disgust, happiness, and sadness. The children themselves and 29 adults later decoded the drawings in an emotion-recognition task. Children were the more accurate decoders, and their accuracy and the accuracy of adults increased significantly for judgments of 7th-grade drawings. The emotions happy and sad were most accurately decoded. There were no significant differences associated with sex. In their drawings, children utilized a symbol system that seems to be based on a highlighting or exaggeration of features of the innately governed facial expression of emotion.
Prehn, Kristin; Kazzer, Philipp; Lischke, Alexander; Heinrichs, Markus; Herpertz, Sabine C; Domes, Gregor
2013-06-01
To investigate the mechanisms by which oxytocin improves socioaffective processing, we measured behavioral and pupillometric data during a dynamic facial emotion recognition task. In a double-blind between-subjects design, 47 men received either 24 IU intranasal oxytocin (OXT) or a placebo (PLC). Participants in the OXT group recognized all facial expressions at lower intensity levels than did participants in the PLC group. Improved performance was accompanied by increased task-related pupil dilation, indicating an increased recruitment of attentional resources. We also found increased pupil dilation during the processing of female compared with male faces. This gender-specific stimulus effect diminished in the OXT group, in which pupil size specifically increased for male faces. Results suggest that improved emotion recognition after OXT treatment might be due to an intensified processing of stimuli that usually do not recruit much attention. Copyright © 2013 Society for Psychophysiological Research.
Cuperlier, Nicolas; Gaussier, Philippe
2017-01-01
Emotions play a significant role in internal regulatory processes. In this paper, we advocate four key ideas. First, novelty detection can be grounded in sensorimotor experience and allows higher-order appraisal. Second, cognitive processes, such as those involved in self-assessment, influence emotional states by eliciting affects like boredom and frustration. Third, emotional processes such as those triggered by self-assessment influence attentional processes. Last, close emotion-cognition interactions implement an efficient feedback loop for the purpose of top-down behavior regulation. The latter is what we call ‘Emotional Metacontrol’. We introduce a model based on artificial neural networks. This architecture is used to control a robotic system in a visual search task. The emotional metacontrol intervenes to bias the robot's visual attention during active object recognition. Through a behavioral and statistical analysis, we show that this mechanism increases the robot's performance and fosters exploratory behavior that avoids deadlocks. PMID:28934291
Sasson, Noah J; Pinkham, Amy E; Weittenhiller, Lauren P; Faso, Daniel J; Simpson, Claire
2016-05-01
Although Schizophrenia (SCZ) and Autism Spectrum Disorder (ASD) share impairments in emotion recognition, the mechanisms underlying these impairments may differ. The current study used the novel "Emotions in Context" task to examine how the interpretation and visual inspection of facial affect is modulated by congruent and incongruent emotional contexts in SCZ and ASD. Both adults with SCZ (n = 44) and those with ASD (n = 21) exhibited reduced affect recognition relative to typically-developing (TD) controls (n = 39) when faces were integrated within broader emotional scenes but not when they were presented in isolation, underscoring the importance of using stimuli that better approximate real-world contexts. Additionally, viewing faces within congruent emotional scenes improved accuracy and visual attention to the face for controls more so than the clinical groups, suggesting that individuals with SCZ and ASD may not benefit from the presence of complementary emotional information as readily as controls. Despite these similarities, important distinctions between SCZ and ASD were found. In every condition, IQ was related to emotion-recognition accuracy for the SCZ group but not for the ASD or TD groups. Further, only the ASD group failed to increase their visual attention to faces in incongruent emotional scenes, suggesting a lower reliance on facial information within ambiguous emotional contexts relative to congruent ones. Collectively, these findings highlight both shared and distinct social cognitive processes in SCZ and ASD that may contribute to their characteristic social disabilities. © The Author 2015. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Recognition of schematic facial displays of emotion in parents of children with autism.
Palermo, Mark T; Pasqualetti, Patrizio; Barbati, Giulia; Intelligente, Fabio; Rossini, Paolo Maria
2006-07-01
Performance on an emotional labeling task in response to schematic facial patterns representing five basic emotions without the concurrent presentation of a verbal category was investigated in 40 parents of children with autism and 40 matched controls. 'Autism fathers' performed worse than 'autism mothers', who performed worse than controls in decoding displays representing sadness or disgust. This indicates the need to include facial expression decoding tasks in genetic research of autism. In addition, emotional expression interactions between parents and their children with autism, particularly through play, where affect and prosody are 'physiologically' exaggerated, may stimulate development of social competence. Future studies could benefit from a combination of stimuli including photographs and schematic drawings, with and without associated verbal categories. This may allow the subdivision of patients and relatives on the basis of the amount of information needed to understand and process social-emotionally relevant information.
Automatic Facial Expression Recognition and Operator Functional State
NASA Technical Reports Server (NTRS)
Blanson, Nina
2011-01-01
The prevalence of human error in safety-critical occupations remains a major challenge to mission success despite increasing automation in control processes. Although various methods have been proposed to prevent incidences of human error, none of these have been developed to employ the detection and regulation of Operator Functional State (OFS), or the optimal condition of the operator while performing a task, in work environments due to drawbacks such as obtrusiveness and impracticality. A video-based system with the ability to infer an individual's emotional state from facial feature patterning mitigates some of the problems associated with other methods of detecting OFS, like obtrusiveness and impracticality in integration with the mission environment. This paper explores the utility of facial expression recognition as a technology for inferring OFS by first expounding on the intricacies of OFS and the scientific background behind emotion and its relationship with an individual's state. Then, descriptions of the feedback loop and the emotion protocols proposed for the facial recognition program are explained. A basic version of the facial expression recognition program uses Haar classifiers and OpenCV libraries to automatically locate key facial landmarks during a live video stream. Various methods of creating facial expression recognition software are reviewed to guide future extensions of the program. The paper concludes with an examination of the steps necessary in the research of emotion and recommendations for the creation of an automatic facial expression recognition program for use in real-time, safety-critical missions.
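A minimal sketch of the detection step described above, using the stock frontal-face and eye cascades that ship with the opencv-python distribution; the frame loop and any downstream operator-state logic are placeholders, and this is not the author's program.

```python
import cv2

# Stock Haar cascades bundled with the opencv-python wheel
face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

capture = cv2.VideoCapture(0)  # live video stream from the default camera
while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)  # face bounding box
        roi = gray[y:y + h, x:x + w]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi):
            cv2.rectangle(frame, (x + ex, y + ey), (x + ex + ew, y + ey + eh), (255, 0, 0), 1)  # eye landmarks
    cv2.imshow("facial landmarks", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
capture.release()
cv2.destroyAllWindows()
```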
Neural correlates of impaired emotion processing in manifest Huntington's disease.
Dogan, Imis; Saß, Christian; Mirzazade, Shahram; Kleiman, Alexandra; Werner, Cornelius J; Pohl, Anna; Schiefer, Johannes; Binkofski, Ferdinand; Schulz, Jörg B; Shah, N Jon; Reetz, Kathrin
2014-05-01
The complex phenotype of Huntington's disease (HD) encompasses motor, psychiatric and cognitive dysfunctions, including early impairments in emotion recognition. In this first functional magnetic resonance imaging study, we investigated emotion-processing deficits in 14 manifest HD patients and matched controls. An emotion recognition task comprised short video clips displaying one of six basic facial expressions (sadness, happiness, disgust, fear, anger and neutral). Structural changes between patients and controls were assessed by means of voxel-based morphometry. Along with deficient recognition of negative emotions, patients exhibited predominantly lower neural response to stimuli of negative valences in the amygdala, hippocampus, striatum, insula, cingulate and prefrontal cortices, as well as in sensorimotor, temporal and visual areas. Most of the observed reduced activity patterns could not be explained merely by regional volume loss. Reduced activity in the thalamus during fear correlated with lower thalamic volumes. During the processing of sadness, patients exhibited enhanced amygdala and hippocampal activity along with reduced recruitment of the medial prefrontal cortex. Higher amygdala activity was related to more pronounced amygdala atrophy and disease burden. Overall, the observed emotion-related dysfunctions in the context of structural neurodegeneration suggest both disruptions of striatal-thalamo-cortical loops and potential compensation mechanism with greater disease severity in manifest HD.
Lacreuse, Agnès; Gore, Heather E; Chang, Jeemin; Kaplan, Emily R
2012-05-15
The role of testosterone (T) in modulating cognitive function and emotion in men remains unclear. The paucity of animal studies has likely contributed to the slow progress in this area. In particular, studies in nonhuman primates have been lacking. Our laboratory has begun to address this issue by pharmacologically manipulating T levels in intact male rhesus monkeys, using blind, placebo-controlled, crossover designs. We previously found that T-suppressed monkeys receiving supraphysiological T for 4 weeks had lower visual recognition memory for long delays and enhanced attention to videos of negative social stimuli (Lacreuse et al., 2009, 2010) compared to when treated with oil. To further delineate the conditions under which T affects cognition and emotion, the present study focused on the short-term effects of physiological T. Six intact males were treated with the gonadotropin-releasing hormone antagonist degarelix (3 mg/kg) for 7 days and received one injection of T enanthate (5 mg/kg) followed by one injection of oil vehicle 7 days later (n=3), or the reverse treatment (n=3). Performance on two computerized tasks, the Delayed-Non-Matching-to-Sample task (DNMS) with random delays and the object-Delayed Recognition Span test (object-DRST), and on one task of emotional reactivity, an approach/avoidance task with negative, familiar and novel objects, was examined at baseline and 3-5 days after treatment. DNMS performance was significantly better when monkeys were treated with T compared to oil, independently of the delay duration or the nature (emotional or neutral) of the stimuli. Performance on the object-DRST was unaffected. Interestingly, subtle changes in emotional reactivity were also observed: T administration was associated with fewer object contacts, especially on negative objects, without overt changes in anxious behaviors. These results may reflect increased vigilance and alertness with high T. Altogether, the data suggest that changes in general arousal may underlie the beneficial effects of T on DNMS performance. This hypothesis will require further study with objective measures of physiological arousal. Copyright © 2012 Elsevier Inc. All rights reserved.
Borg, Céline; Leroy, Nicolas; Favre, Emilie; Laurent, Bernard; Thomas-Antérion, Catherine
2011-06-01
The present study examines the prediction that emotion can facilitate short-term memory. Nevertheless, emotion also recruits attention to process information, thereby disrupting short-term memory when tasks involve high attentional resources. In this way, we aimed to determine whether there is a differential influence of emotional information on short-term memory in ageing and Alzheimer's disease (AD). Fourteen patients with mild AD, 14 healthy older participants (NC), and 14 younger adults (YA) performed two tasks. In the first task, involving visual short-term memory, participants were asked to remember a picture among four different pictures (negative or neutral) following a brief delay. The second task, a binding memory task, required the recognition by participants of a picture according to its spatial location. The attentional cost involved was higher than for the first task. The pattern of results showed that visual memory performance was better for negative stimuli than for neutral ones, irrespective of the group. In contrast, binding memory performance was essentially poorer for the location of negative pictures in the NC group, and for the location of both negative and neutral stimuli in the AD group, in comparison to the YA group. Taken together, these results show that emotion has beneficial effects on visual short-term memory in ageing and AD. In contrast, emotion does not improve their performances in the binding condition. Copyright © 2011 Elsevier Inc. All rights reserved.
Chiu, Isabelle; Piguet, Olivier; Diehl-Schmid, Janine; Riedl, Lina; Beck, Johannes; Leyhe, Thomas; Holsboer-Trachsler, Edith; Kressig, Reto W; Berres, Manfred; Monsch, Andreas U; Sollberger, Marc
Misdiagnosis of early behavioral variant frontotemporal dementia (bvFTD) with major depressive disorder (MDD) is not uncommon due to overlapping symptoms. The aim of this study was to improve the discrimination between these disorders using a novel facial emotion perception task. In this prospective cohort study (July 2013-March 2016), we compared 25 patients meeting Rascovsky diagnostic criteria for bvFTD, 20 patients meeting DSM-IV criteria for MDD, 21 patients meeting McKhann diagnostic criteria for Alzheimer's disease dementia, and 31 healthy participants on a novel emotion intensity rating task comprising morphed low-intensity facial stimuli. Participants were asked to rate the intensity of morphed faces on the congruent basic emotion (eg, rating on sadness when sad face is shown) and on the 5 incongruent basic emotions (eg, rating on each of the other basic emotions when sad face is shown). While bvFTD patients underrated congruent emotions (P < .01), they also overrated incongruent emotions (P < .001), resulting in confusion of facial emotions. In contrast, MDD patients overrated congruent negative facial emotions (P < .001), but not incongruent facial emotions. Accordingly, ratings of congruent and incongruent emotions highly discriminated between bvFTD and MDD patients, ranging from area under the curve (AUC) = 93% to AUC = 98%. Further, an almost complete discrimination (AUC = 99%) was achieved by contrasting the 2 rating types. In contrast, Alzheimer's disease dementia patients perceived emotions similarly to healthy participants, indicating no impact of cognitive impairment on rating scores. Our congruent and incongruent facial emotion intensity rating task allows a detailed assessment of facial emotion perception in patient populations. By using this simple task, we achieved an almost complete discrimination between bvFTD and MDD, potentially helping improve the diagnostic certainty in early bvFTD. © Copyright 2018 Physicians Postgraduate Press, Inc.
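The reported AUC values quantify how well a single rating score separates two diagnostic groups, i.e., the probability that a randomly chosen bvFTD patient scores higher than a randomly chosen MDD patient. A generic sketch of that computation, with invented rating scores rather than the study data, could look like this:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
# Invented incongruent-emotion rating scores: label 0 = MDD patients, 1 = bvFTD patients
labels = np.array([0] * 20 + [1] * 25)
scores = np.concatenate([rng.normal(1.0, 0.8, 20),   # MDD: little overrating of incongruent emotions
                         rng.normal(3.0, 0.8, 25)])  # bvFTD: marked overrating
print(f"AUC = {roc_auc_score(labels, scores):.2f}")  # chance = 0.5, perfect separation = 1.0
```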
ERIC Educational Resources Information Center
Golan, Ofer; Baron-Cohen, Simon; Hill, Jacqueline J.; Rutherford, M. D.
2007-01-01
This study reports a revised version of the "Reading the Mind in the Voice" (RMV) task. The original task (Rutherford et al., (2002), "Journal of Autism and Developmental Disorders, 32," 189-194) suffered from ceiling effects and limited sensitivity. To improve that, the task was shortened and two more foils were added to each of the remaining…
Lee, Hannah; Kim, Jejoong
2017-06-01
It has been reported that visual perception can be influenced not only by the physical features of a stimulus but also by the emotional valence of the stimulus, even without explicit emotion recognition. Some previous studies reported an anger superiority effect while others found a happiness superiority effect during visual perception. It thus remains unclear as to which emotion is more influential. In the present study, we conducted two experiments using biological motion (BM) stimuli to examine whether emotional valence of the stimuli would affect BM perception; and if so, whether a specific type of emotion is associated with a superiority effect. Point-light walkers with three emotion types (anger, happiness, and neutral) were used, and the threshold to detect BM within noise was measured in Experiment 1. Participants showed higher performance in detecting happy walkers compared with the angry and neutral walkers. Follow-up motion velocity analysis revealed that physical difference among the stimuli was not the main factor causing the effect. The results of the emotion recognition task in Experiment 2 also showed a happiness superiority effect, as in Experiment 1. These results show that emotional valence (happiness) of the stimuli can facilitate the processing of BM.
Facial Recognition of Happiness Is Impaired in Musicians with High Music Performance Anxiety.
Sabino, Alini Daniéli Viana; Camargo, Cristielli M; Chagas, Marcos Hortes N; Osório, Flávia L
2018-01-01
Music performance anxiety (MPA) can be defined as a lasting and intense apprehension connected with musical performance in public. Studies suggest that MPA can be regarded as a subtype of social anxiety. Since individuals with social anxiety have deficits in the recognition of facial emotion, we hypothesized that musicians with high levels of MPA would share similar impairments. The aim of this study was to compare parameters of facial emotion recognition (FER) between musicians with high and low MPA. A total of 150 amateur and professional musicians with different musical backgrounds were assessed with respect to their level of MPA and completed a dynamic FER task. The outcomes investigated were accuracy, response time, emotional intensity, and response bias. Musicians with high MPA were less accurate in the recognition of happiness (p = 0.04; d = 0.34), had increased response bias toward fear (p = 0.03), and increased response time to facial emotions as a whole (p = 0.02; d = 0.39). Musicians with high MPA displayed FER deficits that were independent of general anxiety levels and possibly of general cognitive capacity. These deficits may favor the maintenance and exacerbation of experiences of anxiety during public performance, since cues of approval, satisfaction, and encouragement are not adequately recognized.
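The group differences above are reported with Cohen's d effect sizes. The following sketch uses hypothetical accuracy values, not the study's data, to illustrate the pooled-standard-deviation form of d such comparisons typically use.

```python
import numpy as np

def cohens_d(group1, group2):
    """Cohen's d with a pooled standard deviation (assumes roughly
    equal variances; a simplification for illustration)."""
    g1, g2 = np.asarray(group1, float), np.asarray(group2, float)
    n1, n2 = g1.size, g2.size
    pooled_var = ((n1 - 1) * g1.var(ddof=1) +
                  (n2 - 1) * g2.var(ddof=1)) / (n1 + n2 - 2)
    return (g1.mean() - g2.mean()) / np.sqrt(pooled_var)

# Hypothetical happiness-recognition accuracy (proportion correct)
low_mpa  = np.array([0.92, 0.88, 0.95, 0.90, 0.93, 0.89])
high_mpa = np.array([0.85, 0.82, 0.90, 0.84, 0.88, 0.83])

# Effect size of the group difference in favour of the low-MPA group
print(f"d = {cohens_d(low_mpa, high_mpa):.2f}")
```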
Emotion words and categories: evidence from lexical decision.
Scott, Graham G; O'Donnell, Patrick J; Sereno, Sara C
2014-05-01
We examined the categorical nature of emotion word recognition. Positive, negative, and neutral words were presented in lexical decision tasks. Word frequency was additionally manipulated. In Experiment 1, "positive" and "negative" categories of words were implicitly indicated by the blocked design employed. A significant emotion-frequency interaction was obtained, replicating past research. While positive words consistently elicited faster responses than neutral words, only low frequency negative words demonstrated a similar advantage. In Experiments 2a and 2b, explicit categories ("positive," "negative," and "household" items) were specified to participants. Positive words again elicited faster responses than did neutral words. Responses to negative words, however, were no different than those to neutral words, regardless of their frequency. The overall pattern of effects indicates that positive words are always facilitated, frequency plays a greater role in the recognition of negative words, and a "negative" category represents a somewhat disparate set of emotions. These results support the notion that emotion word processing may be moderated by distinct systems.
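The emotion-frequency interaction described above can be read as a difference of differences in mean response times: the negative-word advantage at one frequency level minus the advantage at the other. A small illustrative computation with invented cell means follows.

```python
import numpy as np

# Hypothetical mean lexical-decision RTs (ms) per cell of the
# emotion (negative vs neutral) x frequency (low vs high) design.
rt = {
    ("negative", "low"):  610.0, ("negative", "high"): 585.0,
    ("neutral",  "low"):  655.0, ("neutral",  "high"): 590.0,
}

# The negative-word advantage at each frequency level...
adv_low  = rt[("neutral", "low")]  - rt[("negative", "low")]
adv_high = rt[("neutral", "high")] - rt[("negative", "high")]

# ...and the interaction contrast: an advantage confined to
# low-frequency words shows up as a non-zero difference of differences.
print(f"advantage (low frequency):  {adv_low:.0f} ms")
print(f"advantage (high frequency): {adv_high:.0f} ms")
print(f"emotion x frequency interaction: {adv_low - adv_high:.0f} ms")
```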
Unsupervised learning of facial emotion decoding skills.
Huelle, Jan O; Sack, Benjamin; Broer, Katja; Komlewa, Irina; Anders, Silke
2014-01-01
Research on the mechanisms underlying human facial emotion recognition has long focussed on genetically determined neural algorithms and often neglected the question of how these algorithms might be tuned by social learning. Here we show that facial emotion decoding skills can be significantly and sustainably improved by practice without an external teaching signal. Participants saw video clips of dynamic facial expressions of five different women and were asked to decide which of four possible emotions (anger, disgust, fear, and sadness) was shown in each clip. Although no external information about the correctness of the participant's response or the sender's true affective state was provided, participants showed a significant increase of facial emotion recognition accuracy both within and across two training sessions two days to several weeks apart. We discuss several similarities and differences between the unsupervised improvement of facial decoding skills observed in the current study, unsupervised perceptual learning of simple stimuli described in previous studies and practice effects often observed in cognitive tasks.
Unsupervised learning of facial emotion decoding skills
Huelle, Jan O.; Sack, Benjamin; Broer, Katja; Komlewa, Irina; Anders, Silke
2013-01-01
Research on the mechanisms underlying human facial emotion recognition has long focussed on genetically determined neural algorithms and often neglected the question of how these algorithms might be tuned by social learning. Here we show that facial emotion decoding skills can be significantly and sustainably improved by practice without an external teaching signal. Participants saw video clips of dynamic facial expressions of five different women and were asked to decide which of four possible emotions (anger, disgust, fear, and sadness) was shown in each clip. Although no external information about the correctness of the participant’s response or the sender’s true affective state was provided, participants showed a significant increase of facial emotion recognition accuracy both within and across two training sessions two days to several weeks apart. We discuss several similarities and differences between the unsupervised improvement of facial decoding skills observed in the current study, unsupervised perceptual learning of simple visual stimuli described in previous studies and practice effects often observed in cognitive tasks. PMID:24578686
Catalan, Ana; Gonzalez de Artaza, Maider; Bustamante, Sonia; Orgaz, Pablo; Osa, Luis; Angosto, Virxinia; Valverde, Cristina; Bilbao, Amaia; Madrazo, Arantza; van Os, Jim; Gonzalez-Torres, Miguel Angel
2016-01-01
Facial emotion recognition (FER) is essential to guide social functioning and behaviour for interpersonal communication. FER may be altered in severe mental illness, such as in psychosis and in borderline personality disorder patients. However, it is unclear if these FER alterations are specifically related to psychosis. Awareness of FER alterations may be useful in clinical settings to improve treatment strategies. The aim of our study was to examine FER in patients with severe mental disorder and its relation to psychotic symptomatology. Socio-demographic and clinical variables were collected. Alterations in emotion recognition were assessed in 3 groups: patients with first episode psychosis (FEP) (n = 64), borderline personality disorder patients (BPD) (n = 37) and healthy controls (n = 137), using the Degraded Facial Affect Recognition Task. The Positive and Negative Syndrome Scale, Structured Interview for Schizotypy Revised and Community Assessment of Psychic Experiences scales were used to assess positive psychotic symptoms. WAIS III subtests were used to assess IQ. Kruskal-Wallis analysis showed a significant difference in neutral-face FER scores across the FEP, BPD and control groups, and a difference between FEP patients and controls in angry face recognition. No significant differences were found between groups in the fear or happy conditions. There was a significant difference between groups in the attribution of negative emotion to happy faces: the BPD and FEP groups had a much higher tendency to recognize happy faces as negative. There was no association with the different symptom domains in either group. FEP and BPD patients have more difficulty than controls in recognizing neutral faces. Moreover, patients tend to over-report negative emotions in recognition of happy faces. Although no relation between psychotic symptoms and FER alterations was found, these deficits could contribute to a patient's misinterpretations in daily life.
van Bokhorst, Lindsey G; Knapová, Lenka; Majoranc, Kim; Szebeni, Zea K; Táborský, Adam; Tomić, Dragana; Cañadas, Elena
2016-01-01
In many sports, such as figure skating or gymnastics, the outcome of a performance does not rely exclusively on objective measurements, but on more subjective cues. Judges need high attentional capacities to process visual information and overcome fatigue. Their emotion recognition abilities might also affect how well they detect errors and make accurate assessments. Moreover, the scoring given by judges could also be influenced by their level of expertise. This study aims to assess how rhythmic gymnastics judges' emotion recognition and attentional abilities influence accuracy of performance assessment. Data will be collected from rhythmic gymnastics judges and coaches at different international levels. This study will employ an online questionnaire consisting of an emotion recognition test and an attentional test. Participants' task is to watch a set of videotaped rhythmic gymnastics performances and evaluate them on the artistic and execution components of performance. Their scoring will be compared with the official scores given at the competition from which the video was taken, to measure the accuracy of the participants' evaluations. The proposed research represents an interdisciplinary approach that integrates cognitive and sport psychology within experimental and applied contexts. The current study advances the theoretical understanding of how emotional and attentional aspects affect the evaluation of sport performance. The results will provide valuable evidence on the direction and strength of the relationship between the above-mentioned factors and the accuracy of sport performance evaluation. Importantly, practical implications might be drawn from this study. Intervention programs directed at improving the accuracy of judges could be created based on the understanding of how emotion recognition and attentional abilities are related to the accuracy of performance assessment.
van Bokhorst, Lindsey G.; Knapová, Lenka; Majoranc, Kim; Szebeni, Zea K.; Táborský, Adam; Tomić, Dragana; Cañadas, Elena
2016-01-01
In many sports, such as figure skating or gymnastics, the outcome of a performance does not rely exclusively on objective measurements, but on more subjective cues. Judges need high attentional capacities to process visual information and overcome fatigue. Their emotion recognition abilities might also affect how well they detect errors and make accurate assessments. Moreover, the scoring given by judges could also be influenced by their level of expertise. This study aims to assess how rhythmic gymnastics judges’ emotion recognition and attentional abilities influence accuracy of performance assessment. Data will be collected from rhythmic gymnastics judges and coaches at different international levels. This study will employ an online questionnaire consisting of an emotion recognition test and an attentional test. Participants’ task is to watch a set of videotaped rhythmic gymnastics performances and evaluate them on the artistic and execution components of performance. Their scoring will be compared with the official scores given at the competition from which the video was taken, to measure the accuracy of the participants’ evaluations. The proposed research represents an interdisciplinary approach that integrates cognitive and sport psychology within experimental and applied contexts. The current study advances the theoretical understanding of how emotional and attentional aspects affect the evaluation of sport performance. The results will provide valuable evidence on the direction and strength of the relationship between the above-mentioned factors and the accuracy of sport performance evaluation. Importantly, practical implications might be drawn from this study. Intervention programs directed at improving the accuracy of judges could be created based on the understanding of how emotion recognition and attentional abilities are related to the accuracy of performance assessment. PMID:27458406
Hooker, Christine I; Bruce, Lori; Fisher, Melissa; Verosky, Sara C; Miyakawa, Asako; Vinogradov, Sophia
2012-08-01
Cognitive remediation training has been shown to improve both cognitive and social cognitive deficits in people with schizophrenia, but the mechanisms that support this behavioral improvement are largely unknown. One hypothesis is that intensive behavioral training in cognition and/or social cognition restores the underlying neural mechanisms that support targeted skills. However, there is little research on the neural effects of cognitive remediation training. This study investigated whether a 50 h (10-week) remediation intervention which included both cognitive and social cognitive training would influence neural function in regions that support social cognition. Twenty-two stable, outpatient schizophrenia participants were randomized to a treatment condition consisting of auditory-based cognitive training (AT) [Brain Fitness Program/auditory module ~60 min/day] plus social cognition training (SCT) which was focused on emotion recognition [~5-15 min per day] or a placebo condition of non-specific computer games (CG) for an equal amount of time. Pre and post intervention assessments included an fMRI task of positive and negative facial emotion recognition, and standard behavioral assessments of cognition, emotion processing, and functional outcome. There were no significant intervention-related improvements in general cognition or functional outcome. fMRI results showed the predicted group-by-time interaction. Specifically, in comparison to CG, AT+SCT participants had a greater pre-to-post intervention increase in postcentral gyrus activity during emotion recognition of both positive and negative emotions. Furthermore, among all participants, the increase in postcentral gyrus activity predicted behavioral improvement on a standardized test of emotion processing (MSCEIT: Perceiving Emotions). Results indicate that combined cognition and social cognition training impacts neural mechanisms that support social cognition skills. Copyright © 2012 Elsevier B.V. All rights reserved.
Hooker, Christine I.; Bruce, Lori; Fisher, Melissa; Verosky, Sara C.; Miyakawa, Asako; Vinogradov, Sophia
2012-01-01
Cognitive remediation training has been shown to improve both cognitive and social-cognitive deficits in people with schizophrenia, but the mechanisms that support this behavioral improvement are largely unknown. One hypothesis is that intensive behavioral training in cognition and/or social-cognition restores the underlying neural mechanisms that support targeted skills. However, there is little research on the neural effects of cognitive remediation training. This study investigated whether a 50 hour (10-week) remediation intervention which included both cognitive and social-cognitive training would influence neural function in regions that support social-cognition. Twenty-two stable, outpatient schizophrenia participants were randomized to a treatment condition consisting of auditory-based cognitive training (AT) [Brain Fitness Program/auditory module ~60 minutes/day] plus social-cognition training (SCT) which was focused on emotion recognition [~5–15 minutes per day] or a placebo condition of non-specific computer games (CG) for an equal amount of time. Pre and post intervention assessments included an fMRI task of positive and negative facial emotion recognition, and standard behavioral assessments of cognition, emotion processing, and functional outcome. There were no significant intervention-related improvements in general cognition or functional outcome. FMRI results showed the predicted group-by-time interaction. Specifically, in comparison to CG, AT+SCT participants had a greater pre-to-post intervention increase in postcentral gyrus activity during emotion recognition of both positive and negative emotions. Furthermore, among all participants, the increase in postcentral gyrus activity predicted behavioral improvement on a standardized test of emotion processing (MSCEIT: Perceiving Emotions). Results indicate that combined cognition and social-cognition training impacts neural mechanisms that support social-cognition skills. PMID:22695257
Face emotion recognition is related to individual differences in psychosis-proneness.
Germine, L T; Hooker, C I
2011-05-01
Deficits in face emotion recognition (FER) in schizophrenia are well documented, and have been proposed as a potential intermediate phenotype for schizophrenia liability. However, research on the relationship between psychosis vulnerability and FER has mixed findings and methodological limitations. Moreover, no study has yet characterized the relationship between FER ability and level of psychosis-proneness. If FER ability varies continuously with psychosis-proneness, this suggests a relationship between FER and polygenic risk factors. We tested two large internet samples to see whether psychometric psychosis-proneness, as measured by the Schizotypal Personality Questionnaire-Brief (SPQ-B), is related to differences in face emotion identification and discrimination or other face processing abilities. Experiment 1 (n=2332) showed that psychosis-proneness predicts face emotion identification ability but not face gender identification ability. Experiment 2 (n=1514) demonstrated that psychosis-proneness also predicts performance on face emotion but not face identity discrimination. The tasks in Experiment 2 used identical stimuli and task parameters, differing only in emotion/identity judgment. Notably, the relationships demonstrated in Experiments 1 and 2 persisted even when individuals with the highest psychosis-proneness levels (the putative high-risk group) were excluded from analysis. Our data suggest that FER ability is related to individual differences in psychosis-like characteristics in the normal population, and that these differences cannot be accounted for by differences in face processing and/or visual perception. Our results suggest that FER may provide a useful candidate intermediate phenotype.
Liu, Xinyang; Hildebrandt, Andrea; Recio, Guillermo; Sommer, Werner; Cai, Xinxia; Wilhelm, Oliver
2017-01-01
Facial identity and facial expression processing are crucial socio-emotional abilities but seem to show only limited psychometric uniqueness when the processing speed is considered in easy tasks. We applied a comprehensive measurement of processing speed and contrasted performance specificity in socio-emotional, social and non-social stimuli from an individual differences perspective. Performance in a multivariate task battery could be best modeled by a general speed factor and a first-order factor capturing some specific variance due to processing emotional facial expressions. We further tested equivalence of the relationships between speed factors and polymorphisms of dopamine and serotonin transporter genes. Results show that the speed factors are not only psychometrically equivalent but invariant in their relation with the Catechol-O-Methyl-Transferase (COMT) Val158Met polymorphism. However, the 5-HTTLPR/rs25531 serotonin polymorphism was related with the first-order factor of emotion perception speed, suggesting a specific genetic correlate of processing emotions. We further investigated the relationship between several components of event-related brain potentials with psychometric abilities, and tested emotion specific individual differences at the neurophysiological level. Results revealed swifter emotion perception abilities to go along with larger amplitudes of the P100 and the Early Posterior Negativity (EPN), when emotion processing was modeled on its own. However, after partialling out the shared variance of emotion perception speed with general processing speed-related abilities, brain-behavior relationships did not remain specific for emotion. Together, the present results suggest that speed abilities are strongly interrelated but show some specificity for emotion processing speed at the psychometric level. At both genetic and neurophysiological levels, emotion specificity depended on whether general cognition is taken into account or not. These findings keenly suggest that general speed abilities should be taken into account when the study of emotion recognition abilities is targeted in its specificity. PMID:28848411
Speaker recognition with temporal cues in acoustic and electric hearing
NASA Astrophysics Data System (ADS)
Vongphoe, Michael; Zeng, Fan-Gang
2005-08-01
Natural spoken language processing includes not only speech recognition but also identification of the speaker's gender, age, emotional, and social status. Our purpose in this study is to evaluate whether temporal cues are sufficient to support both speech and speaker recognition. Ten cochlear-implant and six normal-hearing subjects were presented with vowel tokens spoken by three men, three women, two boys, and two girls. In one condition, the subject was asked to recognize the vowel. In the other condition, the subject was asked to identify the speaker. Extensive training was provided for the speaker recognition task. Normal-hearing subjects achieved nearly perfect performance in both tasks. Cochlear-implant subjects achieved good performance in vowel recognition but poor performance in speaker recognition. The level of the cochlear implant performance was functionally equivalent to normal performance with eight spectral bands for vowel recognition but only to one band for speaker recognition. These results show a disassociation between speech and speaker recognition with primarily temporal cues, highlighting the limitation of current speech processing strategies in cochlear implants. Several methods, including explicit encoding of fundamental frequency and frequency modulation, are proposed to improve speaker recognition for current cochlear implant users.
Frank, R; Schulze, L; Hellweg, R; Koehne, S; Roepke, S
2018-05-01
Although deficits in the recognition of emotional facial expressions are considered a hallmark of autism spectrum disorder (ASD), characterization of abnormalities in the differentiation of emotional expressions (e.g., sad vs. angry) has been rather inconsistent, especially in adults without intellectual impairments who may compensate for their deficits. In addition, previous research neglected the ability to detect emotional expressions (e.g., angry vs. neutral). The present study used a backward masking paradigm to investigate (a) the detection of emotional expressions and (b) the differentiation of emotional expressions in adults diagnosed with high functioning autism or Asperger syndrome (n = 23) compared to neurotypical controls (n = 25). Compensatory strategies were prevented by shortening the stimulus presentation time (33, 67, and 100 ms). In general, participants with ASD were significantly less accurate in detecting and differentiating emotional expressions compared to the control group. In the emotion differentiation task, individuals with ASD profited significantly less from an increase in presentation time. These results reinforce theoretical models positing that individuals with ASD have deficits in emotion recognition under time constraints. Furthermore, the study provides initial evidence that both emotion detection and emotion differentiation are impaired in ASD. Copyright © 2018 Elsevier Ltd. All rights reserved.
A Role for REM Sleep in Recalibrating the Sensitivity of the Human Brain to Specific Emotions
Gujar, Ninad; McDonald, Steven Andrew; Nishida, Masaki
2011-01-01
Although the impact of sleep on cognitive function is increasingly well established, the role of sleep in modulating affective brain processes remains largely uncharacterized. Using a face recognition task, here we demonstrate an amplified reactivity to anger and fear emotions across the day, without sleep. However, an intervening nap blocked and even reversed this negative emotional reactivity to anger and fear while conversely enhancing ratings of positive (happy) expressions. Most interestingly, only those subjects who obtained rapid eye movement (REM) sleep displayed this remodulation of affective reactivity for the latter 2 emotion categories. Together, these results suggest that the evaluation of specific human emotions is not static across a daytime waking interval, showing a progressive reactivity toward threat-related negative expressions. However, an episode of sleep can reverse this predisposition, with REM sleep depotentiating negative reactivity toward fearful expressions while concomitantly facilitating recognition and ratings of reward-relevant positive expressions. These findings support the view that sleep, and specifically REM neurophysiology, may represent an important factor governing the optimal homeostasis of emotional brain regulation. PMID:20421251
Palmer, Clare E; Langbehn, Douglas; Tabrizi, Sarah J; Papoutsi, Marina
2017-01-01
Cognitive impairment across multiple domains is common in neurodegenerative movement disorders such as Huntington's disease (HD) and Parkinson's disease (PD). Many tasks are available to assess different aspects of this dysfunction; however, it is imperative that these show high test-retest reliability if they are to be used to track disease progression or response to treatment in patient populations. Moreover, in order to ensure that effects of practice across testing sessions are not misconstrued as clinical improvement in clinical trials, tasks which are particularly vulnerable to practice effects need to be highlighted. In this study we evaluated test-retest reliability in mean performance across three testing sessions of four tasks that are commonly used to measure cognitive dysfunction associated with striatal impairment: a combined Simon Stop-Signal Task; a modified emotion recognition task; a circle tracing task; and the trail making task. Practice effects were seen between sessions 1 and 2 across all tasks for the majority of dependent variables, particularly reaction time variables; some, but not all, diminished in the third session. Good test-retest reliability across all sessions was seen for the emotion recognition, circle tracing, and trail making tasks. The Simon interference effect and stop-signal reaction time (SSRT) from the combined Simon Stop-Signal task showed moderate test-retest reliability; however, the combined SSRT interference effect showed poor test-retest reliability. Our results emphasize the need to use control groups when tracking clinical progression, or to use pre-baseline training on tasks susceptible to practice effects.
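Test-retest reliability and practice effects of the kind evaluated here can be illustrated with a minimal computation. The sketch below uses invented session scores and indexes reliability with a Pearson correlation; the study's analysis may instead rely on intraclass correlations, which additionally penalise systematic shifts between sessions.

```python
import numpy as np

# Hypothetical per-participant scores on the same task at two sessions.
session1 = np.array([12.0, 15.0, 9.0, 14.0, 11.0, 17.0, 13.0, 10.0])
session2 = np.array([13.5, 16.0, 10.5, 15.5, 12.0, 18.5, 14.0, 11.5])

# Test-retest reliability indexed here by the Pearson correlation of
# the two sessions (rank ordering of participants across sessions).
r = np.corrcoef(session1, session2)[0, 1]

# A practice effect shows up as a systematic session-2 minus session-1
# gain, even when the rank ordering (and hence r) is preserved.
practice_effect = (session2 - session1).mean()

print(f"test-retest r = {r:.2f}, mean practice gain = {practice_effect:.2f}")
```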
Social Cognition Psychometric Evaluation: Results of the Final Validation Study.
Pinkham, Amy E; Harvey, Philip D; Penn, David L
2018-06-06
Social cognition is increasingly recognized as an important treatment target in schizophrenia; however, the dearth of well-validated measures that are suitable for use in clinical trials remains a significant limitation. The Social Cognition Psychometric Evaluation (SCOPE) study addresses this need by systematically evaluating the psychometric properties of promising measures. In this final phase of SCOPE, eight new or modified tasks were evaluated. Stable outpatients with schizophrenia (n = 218) and healthy controls (n = 154) completed the battery at baseline and 2-4 weeks later across three sites. Tasks included the Bell Lysaker Emotion Recognition Task (BLERT), Penn Emotion Recognition Task (ER-40), Reading the Mind in the Eyes Task (Eyes), The Awareness of Social Inferences Test (TASIT), Hinting Task, Mini Profile of Nonverbal Sensitivity (MiniPONS), Social Attribution Task-Multiple Choice (SAT-MC), and Intentionality Bias Task (IBT). BLERT and ER-40 modifications included response time and confidence ratings. The Eyes task was modified to include definitions of terms and TASIT to include response time. Hinting was scored with more stringent criteria. MiniPONS, SAT-MC, and IBT were new to this phase. Tasks were evaluated on (1) test-retest reliability, (2) utility as a repeated measure, (3) relationship to functional outcome, (4) practicality and tolerability, (5) sensitivity to group differences, and (6) internal consistency. Hinting, BLERT, and ER-40 showed the strongest psychometric properties and are recommended for use in clinical trials. Eyes, TASIT, and IBT showed somewhat weaker psychometric properties and require further study. MiniPONS and SAT-MC showed poorer psychometric properties that suggest caution for their use in clinical trials.
Kelly, Brian; Maguire-Herring, Vanessa; Rose, Christian M; Gore, Heather E; Ferrigno, Stephen; Novak, Melinda A; Lacreuse, Agnès
2014-11-01
Human aging is characterized by declines in cognition and fine motor function as well as improved emotional regulation. In men, declining levels of testosterone (T) with age have been implicated in the development of these age-related changes. However, studies examining the effects of T replacement on cognition, emotion and fine motor function in older men have not provided consistent results. Rhesus monkeys (Macaca mulatta) are excellent models for human cognitive aging and may provide novel insights on this issue. We tested 10 aged intact male rhesus monkeys (mean age=19, range 15-25) on a battery of cognitive, motor and emotional tasks at baseline and under low or high T experimental conditions. Their performance was compared to that of 6 young males previously tested in the same paradigm (Lacreuse et al., 2009; Lacreuse et al., 2010). Following a 4-week baseline testing period, monkeys were treated with a gonadotropin releasing hormone agonist (Depot Lupron, 200 μg/kg) to suppress endogenous T and were tested on the task battery under a 4-week high T condition (injection of Lupron+T enanthate, 20 mg/kg, n=8) or 4-week low T condition (injection of Lupron+oil vehicle, n=8) before crossing over to the opposite treatment. The cognitive tasks consisted of the Delayed Non-Matching-to-Sample (DNMS), the Delayed Response (DR), and the Delayed Recognition Span Test (spatial-DRST). The emotional tasks included an object Approach-Avoidance task and a task in which monkeys were played videos of unfamiliar conspecifics in different emotional context (Social Playbacks). The fine motor task was the Lifesaver task that required monkeys to remove a Lifesaver candy from rods of different complexity. T manipulations did not significantly affect visual recognition memory, working memory, reference memory or fine motor function at any age. In the Approach-Avoidance task, older monkeys, but not younger monkeys, spent more time in proximity of novel objects in the high T condition relative to the low T condition. In both age groups, high T increased watching time of threatening social stimuli in the Social Playbacks. These results suggest that T affects some aspects of emotional processing but has no effect on fine motor function or cognition in young or older male macaques. It is possible that the duration of T treatment was not long enough to affect cognition or fine motor function or that T levels were too high to improve these outcomes. An alternative explanation for the discrepancies of our findings with some of the cognitive and emotional effects of T reported in rodents and humans may be the use of a chemical castration, which reduced circulating gonadotropins in addition to T. Further studies will investigate whether the luteinizing hormone LH mediates the effects of T on brain function in male primates. Copyright © 2014 Elsevier Inc. All rights reserved.
Congruent bodily arousal promotes the constructive recognition of emotional words.
Kever, Anne; Grynberg, Delphine; Vermeulen, Nicolas
2017-08-01
Considerable research has shown that bodily states shape affect and cognition. Here, we examined whether transient states of bodily arousal influence the categorization speed of high arousal, low arousal, and neutral words. Participants completed two blocks of a constructive recognition task, once after a cycling session (increased arousal), and once after a relaxation session (reduced arousal). Results revealed overall faster response times for high arousal compared to low arousal words, and for positive compared to negative words. Importantly, low arousal words were categorized significantly faster after the relaxation session than after the cycling session, suggesting that a decrease in bodily arousal promotes the recognition of stimuli matching one's current arousal state. These findings highlight the importance of the arousal dimension in emotional processing, and suggest the presence of arousal-congruency effects. Copyright © 2017 Elsevier Inc. All rights reserved.
Fengler, Ineke; Delfau, Pia-Céline; Röder, Brigitte
2018-04-01
It remains unclear whether congenitally deaf cochlear implant (CD CI) users' visual and multisensory emotion perception is influenced by their history of sign language acquisition. We hypothesized that early-signing CD CI users, relative to late-signing CD CI users and hearing, non-signing controls, show better facial expression recognition and rely more on the facial cues of audio-visual emotional stimuli. Two groups of young adult CD CI users, early signers (ES CI users; n = 11) and late signers (LS CI users; n = 10), and a group of hearing, non-signing, age-matched controls (n = 12) performed an emotion recognition task with auditory, visual, and cross-modal emotionally congruent and incongruent speech stimuli. On different trials, participants categorized either the facial or the vocal expressions. The ES CI users more accurately recognized affective prosody than the LS CI users in the presence of congruent facial information. Furthermore, the ES CI users, but not the LS CI users, gained more than the controls from congruent visual stimuli when recognizing affective prosody. Both CI groups performed overall worse than the controls in recognizing affective prosody. These results suggest that early sign language experience affects multisensory emotion perception in CD CI users.
Culture but not gender modulates amygdala activation during explicit emotion recognition.
Derntl, Birgit; Habel, Ute; Robinson, Simon; Windischberger, Christian; Kryspin-Exner, Ilse; Gur, Ruben C; Moser, Ewald
2012-05-29
Mounting evidence indicates that humans have significant difficulties in understanding emotional expressions from individuals of different ethnic backgrounds, leading to reduced recognition accuracy and stronger amygdala activation. However, the impact of gender on the behavioral and neural reactions during the initial phase of cultural assimilation has not been addressed. Therefore, we investigated 24 Asian students (12 females) and 24 age-matched European students (12 females) during an explicit emotion recognition task, using Caucasian facial expressions only, on a high-field MRI scanner. Analysis of functional data revealed bilateral amygdala activation to emotional expressions in Asian and European subjects. However, in the Asian sample, a stronger response of the amygdala emerged and was paralleled by reduced recognition accuracy, particularly for angry male faces. Moreover, no significant gender difference emerged. We also observed a significant inverse correlation between duration of stay and amygdala activation. In this study we investigated the "alien-effect" as an initial problem during cultural assimilation and examined this effect on a behavioral and neural level. This study has revealed bilateral amygdala activation to emotional expressions in Asian and European females and males. In the Asian sample, a stronger response of the amygdala bilaterally was observed and this was paralleled by reduced performance, especially for anger and disgust depicted by male expressions. However, no gender difference occurred. Taken together, while gender exerts only a subtle effect, culture and duration of stay as well as gender of poser are shown to be relevant factors for emotion processing, influencing not only behavioral but also neural responses in female and male immigrants.
Culture but not gender modulates amygdala activation during explicit emotion recognition
2012-01-01
Background Mounting evidence indicates that humans have significant difficulties in understanding emotional expressions from individuals of different ethnic backgrounds, leading to reduced recognition accuracy and stronger amygdala activation. However, the impact of gender on the behavioral and neural reactions during the initial phase of cultural assimilation has not been addressed. Therefore, we investigated 24 Asian students (12 females) and 24 age-matched European students (12 females) during an explicit emotion recognition task, using Caucasian facial expressions only, on a high-field MRI scanner. Results Analysis of functional data revealed bilateral amygdala activation to emotional expressions in Asian and European subjects. However, in the Asian sample, a stronger response of the amygdala emerged and was paralleled by reduced recognition accuracy, particularly for angry male faces. Moreover, no significant gender difference emerged. We also observed a significant inverse correlation between duration of stay and amygdala activation. Conclusion In this study we investigated the “alien-effect” as an initial problem during cultural assimilation and examined this effect on a behavioral and neural level. This study has revealed bilateral amygdala activation to emotional expressions in Asian and European females and males. In the Asian sample, a stronger response of the amygdala bilaterally was observed and this was paralleled by reduced performance, especially for anger and disgust depicted by male expressions. However, no gender difference occurred. Taken together, while gender exerts only a subtle effect, culture and duration of stay as well as gender of poser are shown to be relevant factors for emotion processing, influencing not only behavioral but also neural responses in female and male immigrants. PMID:22642400
Münkler, Paula; Rothkirch, Marcus; Dalati, Yasmin; Schmack, Katharina; Sterzer, Philipp
2015-01-01
Cognitive theories of depression posit that perception is negatively biased in depressive disorder. Previous studies have provided empirical evidence for this notion, but left open the question whether the negative perceptual bias reflects a stable trait or the current depressive state. Here we investigated the stability of negatively biased perception over time. Emotion perception was examined in patients with major depressive disorder (MDD) and healthy control participants in two experiments. In the first experiment subjective biases in the recognition of facial emotional expressions were assessed. Participants were presented with faces that were morphed between sad and neutral and happy expressions and had to decide whether the face was sad or happy. The second experiment assessed automatic emotion processing by measuring the potency of emotional faces to gain access to awareness using interocular suppression. A follow-up investigation using the same tests was performed three months later. In the emotion recognition task, patients with major depression showed a shift in the criterion for the differentiation between sad and happy faces: In comparison to healthy controls, patients with MDD required a greater intensity of the happy expression to recognize a face as happy. After three months, this negative perceptual bias was reduced in comparison to the control group. The reduction in negative perceptual bias correlated with the reduction of depressive symptoms. In contrast to previous work, we found no evidence for preferential access to awareness of sad vs. happy faces. Taken together, our results indicate that MDD-related perceptual biases in emotion recognition reflect the current clinical state rather than a stable depressive trait.
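The criterion shift reported above is typically estimated by fitting a psychometric function to the proportion of "happy" responses across morph levels and comparing the 50% point (point of subjective equality, PSE). A minimal sketch with invented response proportions follows; the function form and data are illustrative assumptions, not the study's analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, pse, slope):
    """Proportion of 'happy' responses as a function of morph level;
    pse is the morph level yielding 50% 'happy' responses."""
    return 1.0 / (1.0 + np.exp(-(x - pse) / slope))

# Morph levels from clearly sad (0) to clearly happy (100) and
# hypothetical proportions of 'happy' responses for two observers.
morph = np.array([0, 20, 40, 50, 60, 80, 100], dtype=float)
p_happy_control = np.array([0.02, 0.10, 0.35, 0.55, 0.80, 0.95, 0.99])
p_happy_mdd     = np.array([0.01, 0.05, 0.20, 0.35, 0.60, 0.90, 0.98])

fit_control, _ = curve_fit(logistic, morph, p_happy_control, p0=[50.0, 10.0])
fit_mdd, _     = curve_fit(logistic, morph, p_happy_mdd,     p0=[50.0, 10.0])

# A criterion shift toward 'sad' appears as a higher PSE: more happiness
# is needed in the morph before the face is judged happy.
print(f"PSE control: {fit_control[0]:.1f}, PSE MDD: {fit_mdd[0]:.1f}")
```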
Deficits in facial affect recognition among antisocial populations: a meta-analysis.
Marsh, Abigail A; Blair, R J R
2008-01-01
Individuals with disorders marked by antisocial behavior frequently show deficits in recognizing displays of facial affect. Antisociality may be associated with specific deficits in identifying fearful expressions, which would implicate dysfunction in neural structures that subserve fearful expression processing. A meta-analysis of 20 studies was conducted to assess: (a) if antisocial populations show any consistent deficits in recognizing six emotional expressions; (b) beyond any generalized impairment, whether specific fear recognition deficits are apparent; and (c) if deficits in fear recognition are a function of task difficulty. Results show a robust link between antisocial behavior and specific deficits in recognizing fearful expressions. This impairment cannot be attributed solely to task difficulty. These results suggest dysfunction among antisocial individuals in specified neural substrates, namely the amygdala, involved in processing fearful facial affect.
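Meta-analyses of this kind pool per-study effect sizes weighted by their precision. The sketch below shows a basic fixed-effect, inverse-variance pooling with invented effect sizes and variances; it is not a reconstruction of the reported analysis, which may have used random-effects models.

```python
import numpy as np

# Hypothetical per-study effect sizes (e.g., Hedges g for the antisocial
# vs comparison group difference in fear recognition) and their
# sampling variances.
g   = np.array([0.55, 0.30, 0.72, 0.41, 0.60, 0.25, 0.48])
var = np.array([0.04, 0.06, 0.09, 0.05, 0.07, 0.08, 0.05])

# Fixed-effect (inverse-variance) pooling: studies with smaller variance
# (typically larger samples) receive proportionally more weight.
w = 1.0 / var
g_pooled = np.sum(w * g) / np.sum(w)
se_pooled = np.sqrt(1.0 / np.sum(w))

ci_low, ci_high = g_pooled - 1.96 * se_pooled, g_pooled + 1.96 * se_pooled
print(f"pooled g = {g_pooled:.2f}, 95% CI [{ci_low:.2f}, {ci_high:.2f}]")
```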
Santaniello, G; Ferré, P; Rodríguez-Gómez, P; Poch, C; Moreno, E M; Hinojosa, J A
2018-06-15
Evidence from prior studies has shown an advantage in recognition memory for emotional compared to neutral words. Whether this advantage is short-lived or rather extends over longer periods, as well as whether the effect depends on words' valence (i.e., positive or negative), remains unknown. In the present ERP/EEG study, we investigated this issue by manipulating the lag distance (LAG-2, LAG-8 and LAG-16) between the presentation of old and new words in an online recognition memory task. LAG differences were observed at behavior, ERPs and in the theta frequency band. In line with previous studies, negative words were associated with faster reaction times, higher hit rates and increased amplitude in a positive ERP component between 386 and 564 ms compared to positive and neutral words. Remarkably, the interaction of LAG by EMOTION revealed that negative words were associated with better performance and larger ERPs amplitudes only at LAG-2. Also in the LAG-2 condition, emotional words (i.e., positive and negative words) induced a stronger desynchronization in the beta band between 386 and 542 ms compared to neutral words. These early enhanced memory effects for emotional words are discussed in terms of the Negative Emotional Valence Enhances Recapitulation (NEVER) model and the mobilization-minimization hypothesis. Copyright © 2018 Elsevier Ltd. All rights reserved.
Hattingh, Coenraad J.; Ipser, J.; Tromp, S. A.; Syal, S.; Lochner, C.; Brooks, S. J.; Stein, D. J.
2012-01-01
Background: Social anxiety disorder (SAD) is characterized by abnormal fear and anxiety in social situations. Functional magnetic resonance imaging (fMRI) is a brain imaging technique that can be used to demonstrate neural activation to emotionally salient stimuli. However, no attempt has yet been made to statistically collate fMRI studies of brain activation, using the activation likelihood-estimate (ALE) technique, in response to emotion recognition tasks in individuals with SAD. Methods: A systematic search of fMRI studies of neural responses to socially emotive cues in SAD was undertaken. ALE meta-analysis, a voxel-based meta-analytic technique, was used to estimate the most significant activations during emotional recognition. Results: Seven studies were eligible for inclusion in the meta-analysis, constituting a total of 91 subjects with SAD, and 93 healthy controls. The most significant areas of activation during emotional vs. neutral stimuli in individuals with SAD compared to controls were: bilateral amygdala, left medial temporal lobe encompassing the entorhinal cortex, left medial aspect of the inferior temporal lobe encompassing perirhinal cortex and parahippocampus, right anterior cingulate, right globus pallidus, and distal tip of right postcentral gyrus. Conclusion: The results are consistent with neuroanatomic models of the role of the amygdala in fear conditioning, and the importance of the limbic circuitry in mediating anxiety symptoms. PMID:23335892
Nishikawa, Saori; Toshima, Tamotsu; Kobayashi, Masao
2015-01-01
This study examined changes in prefrontal oxy-Hb levels measured by NIRS (Near-Infrared Spectroscopy) during a facial-emotion recognition task in healthy adults, testing a mediational/moderational model of these variables. Fifty-three healthy adults (male = 35, female = 18) aged between 22 and 37 years (mean age = 24.05 years) provided saliva samples, completed an EMBU questionnaire (Swedish acronym for Egna Minnen Beträffande Uppfostran [My memories of upbringing]), and participated in a facial-emotion recognition task during NIRS recording. There was a main effect of maternal rejection on RoxH (right frontal activation during an ambiguous task), and a gene × environment (G × E) interaction on RoxH, suggesting that individuals who carry the SL or LL genotype and who endorse greater perceived maternal rejection show less right frontal activation than SL/LL carriers with lower perceived maternal rejection. Finally, perceived parenting style played a mediating role in right frontal activation via the 5-HTTLPR genotype. Early-perceived parenting might influence neural activity in an uncertain situation (i.e., rating ambiguous faces) among individuals with certain genotypes. This preliminary study makes a small contribution to the mapping of an influence of gene and behaviour on the neural system. More such attempts should be made in order to clarify the links.
Nishikawa, Saori
2015-01-01
This study examined changes in prefrontal oxy-Hb levels measured by NIRS (Near-Infrared Spectroscopy) during a facial-emotion recognition task in healthy adults, testing a mediational/moderational model of these variables. Fifty-three healthy adults (male = 35, female = 18) aged between 22 and 37 years (mean age = 24.05 years) provided saliva samples, completed an EMBU questionnaire (Swedish acronym for Egna Minnen Beträffande Uppfostran [My memories of upbringing]), and participated in a facial-emotion recognition task during NIRS recording. There was a main effect of maternal rejection on RoxH (right frontal activation during an ambiguous task), and a gene × environment (G×E) interaction on RoxH, suggesting that individuals who carry the SL or LL genotype and who endorse greater perceived maternal rejection show less right frontal activation than SL/LL carriers with lower perceived maternal rejection. Finally, perceived parenting style played a mediating role in right frontal activation via the 5-HTTLPR genotype. Early-perceived parenting might influence neural activity in an uncertain situation (i.e., rating ambiguous faces) among individuals with certain genotypes. This preliminary study makes a small contribution to the mapping of an influence of gene and behaviour on the neural system. More such attempts should be made in order to clarify the links. PMID:26418317
Camalier, Corrie R; McHugo, Maureen; Zald, David H; Neimat, Joseph S
2018-01-01
In addition to motor symptoms, Parkinson's disease (PD) involves significant non-motor sequelae, including disruptions in cognitive and emotional processing. Fear recognition appears to be affected both by the course of the disease and by a common interventional therapy, deep brain stimulation of the subthalamic nucleus (STN-DBS). Here, we examined whether these effects extend to other aspects of emotional processing, such as attentional capture by negative emotional stimuli. Performance on an emotional attentional blink (EAB) paradigm, commonly used to study the emotional capture of attention, was examined in a cohort of individuals with PD, both on and off STN-DBS therapy (n=20). To contrast the effects of healthy aging and of other movement disorders and DBS targets, we also examined performance in a healthy elderly (n=20) and young (n=18) sample on the same task, and in a sample diagnosed with Essential Tremor (ET) undergoing therapeutic deep brain stimulation of the ventral-intermediate nucleus (VIM-DBS, n=18). All four groups showed robust attentional capture by emotional stimuli, irrespective of aging processes, movement disorder diagnosis, or stimulation. PD patients on average had overall worse performance, but this decrement in performance was not related to the emotional capture of attention. PD patients exhibited a robust EAB, indicating that the ability of emotion to direct attention remains intact in PD. Congruent with other recent data, these findings suggest that fear recognition deficits in PD may instead reflect a highly specific problem in recognition, rather than a general deficit in emotional processing of fearful stimuli.
The visual discrimination of negative facial expressions by younger and older adults.
Mienaltowski, Andrew; Johnson, Ellen R; Wittman, Rebecca; Wilson, Anne-Taylor; Sturycz, Cassandra; Norman, J Farley
2013-04-05
Previous research has demonstrated that older adults are not as accurate as younger adults at perceiving negative emotions in facial expressions. These studies rely on emotion recognition tasks that involve choosing between many alternatives, creating the possibility that age differences emerge for cognitive rather than perceptual reasons. In the present study, an emotion discrimination task was used to investigate younger and older adults' ability to visually discriminate between negative emotional facial expressions (anger, sadness, fear, and disgust) at low (40%) and high (80%) expressive intensity. Participants completed trials blocked by pairs of emotions. Discrimination ability was quantified from the participants' responses using signal detection measures. In general, the results indicated that older adults had more difficulty discriminating between low intensity expressions of negative emotions than did younger adults. However, younger and older adults did not differ when discriminating between anger and sadness. These findings demonstrate that age differences in visual emotion discrimination emerge when signal detection measures are used but that these differences are not uniform and occur only in specific contexts.
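The signal detection measures mentioned above typically include d-prime (sensitivity) and the response criterion. A minimal sketch with invented response counts follows; the correction for extreme rates shown here is one common convention and is not necessarily the one used in the study.

```python
import numpy as np
from scipy.stats import norm

def dprime(hits, misses, false_alarms, correct_rejections):
    """d' and criterion c, with a log-linear correction so that rates
    of 0 or 1 do not produce infinite z-scores."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    d = norm.ppf(hit_rate) - norm.ppf(fa_rate)          # sensitivity
    c = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))  # response criterion
    return d, c

# Hypothetical counts for discriminating, say, fear from disgust
# at low expressive intensity in two age groups.
younger = dprime(hits=42, misses=8, false_alarms=10, correct_rejections=40)
older   = dprime(hits=34, misses=16, false_alarms=18, correct_rejections=32)

print(f"younger: d' = {younger[0]:.2f}, c = {younger[1]:.2f}")
print(f"older:   d' = {older[0]:.2f}, c = {older[1]:.2f}")
```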
Mothersill, David; Dillon, Rachael; Hargreaves, April; Castorina, Marco; Furey, Emilia; Fagan, Andrew J; Meaney, James F; Fitzmaurice, Brian; Hallahan, Brian; McDonald, Colm; Wykes, Til; Corvin, Aiden; Robertson, Ian H; Donohoe, Gary
2018-05-27
Working memory based cognitive remediation therapy (CT) for psychosis has recently been associated with broad improvements in performance on untrained tasks measuring working memory, episodic memory and IQ, and changes in associated brain regions. However, it is unclear if these improvements transfer to the domain of social cognition and neural activity related to performance on social cognitive tasks. We examined performance on the Reading the Mind in the Eyes test (Eyes test) in a large sample of participants with psychosis who underwent working memory based CT (N = 43) compared to a Control Group of participants with psychosis (N = 35). In a subset of this sample, we used functional magnetic resonance imaging (fMRI) to examine changes in neural activity during a facial emotion recognition task in participants who underwent CT (N = 15) compared to a Control Group (N = 15). No significant effects of CT were observed on Eyes test performance or on neural activity during facial emotion recognition, either at p<0.05 family-wise error, or at a p<0.001 uncorrected threshold, within a priori social cognitive regions of interest. This study suggests that working memory based CT does not significantly impact an aspect of social cognition which was measured behaviourally and neurally. It provides further evidence that deficits in the ability to decode mental state from facial expressions are dissociable from working memory deficits, and suggests that future CT programs should target social cognition in addition to working memory for the purposes of further enhancing social function. This article is protected by copyright. All rights reserved.
Emotional memory and perception in temporal lobectomy patients with amygdala damage.
Brierley, B; Medford, N; Shaw, P; David, A S
2004-04-01
The human amygdala is implicated in the formation of emotional memories and the perception of emotional stimuli, particularly fear, across various modalities. To discern the extent to which these functions are related, 28 patients who had undergone anterior temporal lobectomy (13 left and 15 right) for intractable epilepsy were recruited. Structural magnetic resonance imaging showed that three of them had atrophy of their remaining amygdala. All participants were given tests of affect perception from facial and vocal expressions and of emotional memory, using a standard narrative test and a novel test of word recognition. The results were standardised against matched healthy controls. Performance on all emotion tasks in patients with unilateral lobectomy ranged from unimpaired to moderately impaired. Perception of emotions in faces and voices was (with exceptions) significantly positively correlated, indicating multimodal emotional processing. However, there was no correlation between the subjects' performance on tests of emotional memory and perception. Several subjects showed strong emotional memory enhancement but poor fear perception. Patients with bilateral amygdala damage had greater impairment, particularly on the narrative test of emotional memory, one showing superior fear recognition but absent memory enhancement. Bilateral amygdala damage is particularly disruptive of emotional memory processes in comparison with unilateral temporal lobectomy. On a cognitive level, the pattern of results implies that perception of emotional expressions and emotional memory are supported by separate processing systems or streams.
Emotional Speech Perception Unfolding in Time: The Role of the Basal Ganglia
Paulmann, Silke; Ott, Derek V. M.; Kotz, Sonja A.
2011-01-01
The basal ganglia (BG) have repeatedly been linked to emotional speech processing in studies involving patients with neurodegenerative and structural changes of the BG. However, the majority of previous studies did not consider (i) that emotional speech processing entails multiple processing steps, and (ii) the possibility that the BG may engage in one rather than the other of these processing steps. In the present study we investigate three different stages of emotional speech processing (emotional salience detection, meaning-related processing, and identification) in the same patient group to verify whether lesions to the BG affect these stages in a qualitatively different manner. Specifically, we explore early implicit emotional speech processing (probe verification) in an ERP experiment followed by an explicit behavioral emotional recognition task. In both experiments, participants listened to emotional sentences expressing one of four emotions (anger, fear, disgust, happiness) or neutral sentences. In line with previous evidence, patients and healthy controls showed differentiation of emotional and neutral sentences in the P200 component (emotional salience detection) and a following negative-going brain wave (meaning-related processing). However, the behavioral recognition (identification stage) of emotional sentences was impaired in BG patients, but not in healthy controls. The current data provide further support that the BG are involved in late, explicit rather than early emotional speech processing stages. PMID:21437277
Hoaken, Peter N S; Allaby, David B; Earle, Jeff
2007-01-01
Violence is a social problem that carries enormous costs; however, our understanding of its etiology is quite limited. A large body of research exists, which suggests a relationship between abnormalities of the frontal lobe and aggression; as a result, many researchers have implicated deficits in so-called "executive function" as an antecedent to aggressive behaviour. Another possibility is that violence may be related to problems interpreting facial expressions of emotion, a deficit associated with many forms of psychopathology, and an ability linked to the prefrontal cortex. The current study investigated performance on measures of executive function and on a facial-affect recognition task in 20 violent offenders, 20 non-violent offenders, and 20 controls. In support of our hypotheses, both offender groups performed significantly more poorly on measures of executive function relative to controls. In addition, violent offenders were significantly poorer on the facial-affect recognition task than either of the other two groups. Interestingly, scores on these measures were significantly correlated, with executive deficits associated with difficulties accurately interpreting facial affect. The implications of these results are discussed in terms of a broader understanding of violent behaviour. Copyright 2007 Wiley-Liss, Inc.
Röder, Christian H; Mohr, Harald; Linden, David E J
2011-02-01
Faces are multidimensional stimuli that convey information for complex social and emotional functions. Separate neural systems have been implicated in the recognition of facial identity (mainly extrastriate visual cortex) and emotional expression (limbic areas and the superior temporal sulcus). Working-memory (WM) studies with faces have shown different but partly overlapping activation patterns in comparison to spatial WM in parietal and prefrontal areas. However, little is known about the neural representations of the different facial dimensions during WM. In the present study 22 subjects performed a face-identity or face-emotion WM task at different load levels during functional magnetic resonance imaging. We found a fronto-parietal-visual WM-network for both tasks during maintenance, including fusiform gyrus. Limbic areas in the amygdala and parahippocampal gyrus demonstrated a stronger activation for the identity than the emotion condition. One explanation for this finding is that the repetitive presentation of faces with different identities but the same emotional expression during the identity-task is responsible for the stronger increase in BOLD signal in the amygdala. These results raise the question how different emotional expressions are coded in WM. Our findings suggest that emotional expressions are re-coded in an abstract representation that is supported at the neural level by the canonical fronto-parietal WM network. Copyright © 2010 Elsevier Ltd. All rights reserved.
Moon, Chung-Man; Yang, Jong-Chul; Jeong, Gwang-Woo
2017-01-01
The functional neuroanatomy of explicit memory in conjunction with the major anxiety symptoms in patients with generalized anxiety disorder (GAD) has not yet been clearly identified. This study aimed to investigate the brain activation patterns underlying the interaction between emotional and cognitive function during explicit memory tasks, as well as their correlation with clinical characteristics in GAD. The participants comprised GAD patients and age-matched healthy controls. The fMR images were obtained while the participants performed an explicit memory task with neutral and anxiety-inducing words. Patients showed significantly decreased functional activity in the putamen, head of the caudate nucleus, hippocampus, and middle cingulate gyrus during the memory tasks with both neutral and anxiety-inducing words, whereas activity in the precentral gyrus and ventrolateral prefrontal cortex was significantly increased only in the memory tasks with anxiety-inducing words. Also, the blood oxygenation level-dependent (BOLD) signal changes in the hippocampus were positively correlated with recognition accuracy for both neutral and anxiety-inducing words. This study identified the brain areas associated with the interaction between emotional regulation and cognitive function in explicit memory tasks in patients with GAD. These findings may help to clarify the neural mechanisms underlying explicit memory-related cognitive deficits and emotional dysfunction associated with GAD symptoms. © The Foundation Acta Radiologica 2016.
Cox, Sharon; Bertoux, Maxime; Turner, John J D; Moss, Antony; Locker, Kirsty; Riggs, Kevin
2018-06-01
Alcohol Use Disorder (AUD) is associated with problems with processing complex social scenarios. Little is known about the relationship between distinct AUD-related factors (e.g., years of problematic drinking), aspects of cognitive function and dysfunction in individuals diagnosed with AUD, and the relative impact these may have on social cognition. This study explored differences in social cognition between a group of participants diagnosed with AUD and controls, using a clinical measure, the Mini Social and Emotional Assessment (mini-SEA). The mini-SEA evaluates social and emotional understanding through a facial emotion recognition task and through a series of social scenes, some of which contain a faux pas (social error). Eighty-five participants (individuals with AUD and controls) completed demographic questions and a general cognitive and social cognitive test battery over three consecutive days. Between-group analyses revealed that the participants with AUD performed less well on the faux pas test, and differences were also revealed on the emotional facial recognition task. Years of problematic alcohol consumption was the strongest predictor of poor theory of mind (ToM) reasoning. These results suggest a strong link between AUD chronicity and social cognition, though the direction of this relationship needs further elucidation. This may be of clinical relevance to abstinence and relapse management, as basic social cognition skills and the ability to maintain interpersonal relationships are likely to be crucial to recovery. Copyright © 2018 Elsevier B.V. All rights reserved.
Li, Shijia; Weerda, Riklef; Milde, Christopher; Wolf, Oliver T; Thiel, Christiane M
2015-02-01
Noradrenaline interacts with stress hormones in the amygdala and hippocampus to enhance emotional memory consolidation, but the noradrenergic-glucocorticoid interaction at retrieval, where stress impairs memory, is less well understood. We used a genetic neuroimaging approach to investigate whether a genetic variation of the noradrenergic system impacts stress-induced neural activity in the amygdala and hippocampus during recognition of emotional memory. This study is based on a genotype-dependent reanalysis of data from our previous publication (Li et al., Brain Imaging Behav 2014). Twenty-two healthy male volunteers were genotyped for the ADRA2B gene encoding the α2B-adrenergic receptor. Ten deletion carriers and 12 noncarriers performed an emotional face recognition task while their brain activity was measured with fMRI. During encoding, 50 fearful and 50 neutral faces were presented. One hour later, participants underwent either an acute stress procedure (Trier Social Stress Test) or a control procedure, which was followed immediately by the retrieval session, in which they had to discriminate between 100 old and 50 new faces. A genotype-dependent modulation of neural activity at retrieval was found in the bilateral amygdala and right hippocampus. Deletion carriers showed decreased neural activity in the amygdala when recognizing emotional faces in the control condition and increased amygdala activity under stress. Noncarriers showed no differences in emotionally modulated amygdala activation under stress or control conditions; instead, stress-induced increases during recognition of emotional faces were present in the right hippocampus. The genotype-dependent effects of acute stress on neural activity in the amygdala and hippocampus provide evidence for a noradrenergic-glucocorticoid interaction in emotional memory retrieval.
The Automaticity of Emotional Face-Context Integration
Aviezer, Hillel; Dudarev, Veronica; Bentin, Shlomo; Hassin, Ran R.
2011-01-01
Recent studies have demonstrated that context can dramatically influence the recognition of basic facial expressions, yet the nature of this phenomenon is largely unknown. In the present paper we begin to characterize the underlying process of face-context integration. Specifically, we examine whether it is a relatively controlled or automatic process. In Experiment 1, participants were motivated and instructed to avoid using the context while categorizing contextualized facial expressions, or they were led to believe that the context was irrelevant. Nevertheless, they were unable to disregard the context, which exerted a strong effect on their emotion recognition. In Experiment 2, participants categorized contextualized facial expressions while engaged in a concurrent working memory task. Despite the load, the context exerted a strong influence on their recognition of facial expressions. These results suggest that facial expressions and their body contexts are integrated in an unintentional, uncontrollable, and relatively effortless manner. PMID:21707150
Emotion regulation during the encoding of emotional stimuli: Effects on subsequent memory.
Leventon, Jacqueline S; Bauer, Patricia J
2016-02-01
In the adult literature, emotional arousal is regarded as a source of the enhancing effect of emotion on subsequent memory. Here, we used behavioral and electrophysiological methods to examine the role of emotional arousal in subsequent memory in school-age children. Furthermore, we implemented a reappraisal instruction to manipulate (down-regulate) emotional arousal at encoding in order to examine the relation between emotional arousal and subsequent memory. Participants (8-year-old girls) viewed emotional scenes while electrophysiological (EEG) data were recorded and participated in a memory task 1 to 5 days later, during which EEG and behavioral responses were recorded; participants provided subjective ratings of the scenes after the memory task. The reappraisal instruction successfully reduced emotional arousal responses to negative stimuli but not positive stimuli. Similarly, recognition performance in both event-related potentials (ERPs) and behavior was impaired for reappraised negative stimuli but not positive stimuli. The findings indicate that ERPs are sensitive to the reappraisal of negative stimuli in children as young as 8 years. Furthermore, the findings suggest an interaction of emotion and memory during the school years, implicating the explanatory role of emotional arousal at encoding in subsequent memory performance in female children as young as 8 years. Copyright © 2015 Elsevier Inc. All rights reserved.
Scaling of Advanced Theory-of-Mind Tasks
ERIC Educational Resources Information Center
Osterhaus, Christopher; Koerber, Susanne; Sodian, Beate
2016-01-01
Advanced theory-of-mind (AToM) development was investigated in three separate studies involving 82, 466, and 402 elementary school children (8-, 9-, and 10-year-olds). Rasch and factor analyses assessed whether common conceptual development underlies higher-order false-belief understanding, social understanding, emotion recognition, and…
Cognitive mechanisms of diazepam administration: a healthy volunteer model of emotional processing.
Pringle, A; Warren, M; Gottwald, J; Cowen, P J; Harmer, C J
2016-06-01
Benzodiazepine drugs continue to be prescribed relatively frequently for anxiety disorders, especially where other treatments have failed or when rapid alleviation of anxiety is imperative. The neuropsychological mechanism by which these drugs act to relieve symptoms, however, remains underspecified. Cognitive accounts of anxiety disorders emphasise hypervigilance for threat in the maintenance of the disorders. The current study examined the effects of 7- or 8-day administration of diazepam in healthy participants (n = 36) on a well-validated battery of tasks measuring emotional processing, including measures of vigilance for threat and physiological responses to threat. Compared to placebo, diazepam reduced vigilant-avoidant patterns of emotional attention (p < 0.01) and reduced general startle responses (p < 0.05). Diazepam administration had limited effects on emotional processing, enhancing the response to positive vs negative words in the emotional categorisation task (p < 0.05), modulating emotional memory in terms of false accuracy (p < 0.05) and slowing the recognition of all facial expressions of emotion (p = 0.01). These results have implications for our understanding of the cognitive mechanisms of benzodiazepine treatment. The data reported here suggest that diazepam modulates emotional attention, an effect which may be involved in its therapeutic actions in anxiety.
Interoceptive sensitivity predicts sensitivity to the emotions of others.
Terasawa, Yuri; Moriguchi, Yoshiya; Tochizawa, Saiko; Umeda, Satoshi
2014-01-01
Some theories of emotion emphasise a close relationship between interoception and subjective experiences of emotion. In this study, we used facial expressions to examine whether interoceptive sensitivity modulated emotional experience in a social context. Interoceptive sensitivity was measured using the heartbeat detection task. To estimate individual emotional sensitivity, we made morphed photos that ranged between a neutral and an emotional facial expression (i.e., anger, sadness, disgust and happiness). Recognition rates of particular emotions from these photos were calculated and considered as emotional sensitivity thresholds. Our results indicate that participants with accurate interoceptive awareness are sensitive to the emotions of others, especially for expressions of sadness and happiness. We also found that false responses to sad faces were closely related to an individual's degree of social anxiety. These results suggest that interoceptive awareness modulates the intensity of the subjective experience of emotion and affects individual traits related to emotion processing.
Probiotics drive gut microbiome triggering emotional brain signatures.
Bagga, Deepika; Reichert, Johanna Louise; Koschutnig, Karl; Aigner, Christoph Stefan; Holzer, Peter; Koskinen, Kaisa; Eichinger, Christine Moissl; Schöpf, Veronika
2018-05-03
Experimental manipulation of the gut microbiome was found to modify emotional and cognitive behavior, neurotransmitter expression and brain function in rodents, but corresponding human data remain scarce. The present double-blind, placebo-controlled randomised study aimed at investigating the effects of 4 weeks' probiotic administration on behavior, brain function and gut microbial composition in healthy volunteers. Forty-five healthy participants divided equally into three groups (probiotic, placebo and no intervention) underwent functional MRI (emotional decision-making and emotional recognition memory tasks). In addition, stool samples were collected to investigate the gut microbial composition. Probiotic administration for 4 weeks was associated with changes in brain activation patterns in response to emotional memory and emotional decision-making tasks, which were also accompanied by subtle shifts in gut microbiome profile. Microbiome composition mirrored self-reported behavioral measures and memory performance. This is the first study reporting a distinct influence of probiotic administration at behavioral, neural, and microbiome levels at the same time in healthy volunteers. The findings provide a basis for future investigations into the role of the gut microbiota and potential therapeutic application of probiotics.
Theory of Mind Predicts Emotion Knowledge Development in Head Start Children
Seidenfeld, Adina M.; Johnson, Stacy R.; Cavadel, Elizabeth Woodburn; Izard, Carroll E.
2014-01-01
Research Findings Emotion knowledge (EK) enables children to identify emotions in themselves and others, and its development facilitates emotion recognition in complex social situations. Social-cognitive processes, such as theory of mind (ToM), may contribute to developing EK by helping children realize the inherent variability associated with emotion expression across individuals and situations. The present study explored how ToM, particularly false belief understanding, in preschool predicts children’s developing EK in kindergarten. Participants were 60 3- to 5-year-old Head Start children. ToM and EK measures were obtained from standardized child tasks. ToM scores were positively related to performance on an EK task in kindergarten after controlling for preschool levels of EK and verbal ability. Exploratory analyses provided preliminary evidence that verbal ability influences EK indirectly through ToM. Practice or Policy Early intervention programs may benefit from including lessons on ToM to help promote socio-emotional learning, specifically EK. This consideration may be most fruitful when the targeted population is at risk. PMID:25364212
Perceptual and affective mechanisms in facial expression recognition: An integrative review.
Calvo, Manuel G; Nummenmaa, Lauri
2016-09-01
Facial expressions of emotion involve a physical component of morphological changes in a face and an affective component conveying information about the expresser's internal feelings. It remains unresolved how much recognition and discrimination of expressions rely on the perception of morphological patterns or the processing of affective content. This review of research on the role of visual and emotional factors in expression recognition reached three major conclusions. First, behavioral, neurophysiological, and computational measures indicate that basic expressions are reliably recognized and discriminated from one another, albeit the effect may be inflated by the use of prototypical expression stimuli and forced-choice responses. Second, affective content along the dimensions of valence and arousal is extracted early from facial expressions, although this coarse affective representation contributes minimally to categorical recognition of specific expressions. Third, the physical configuration and visual saliency of facial features contribute significantly to expression recognition, with "emotionless" computational models being able to reproduce some of the basic phenomena demonstrated in human observers. We conclude that facial expression recognition, as it has been investigated in conventional laboratory tasks, depends to a greater extent on perceptual than affective information and mechanisms.
Turner, Rose; Felisberti, Fatima M.
2017-01-01
Mindreading refers to the ability to attribute mental states, including thoughts, intentions and emotions, to oneself and others, and is essential for navigating the social world. Empirical mindreading research has predominantly featured children, groups with autism spectrum disorder and clinical samples, and many standard tasks suffer ceiling effects with neurologically typical (NT) adults. We first outline a case for studying mindreading in NT adults and proceed to review tests of emotion perception, cognitive and affective mentalizing, and multidimensional tasks combining these facets. We focus on selected examples of core experimental paradigms including emotion recognition tests, social vignettes, narrative fiction (prose and film) and participative interaction (in real and virtual worlds), highlighting challenges for studies with NT adult cohorts. We conclude that naturalistic, multidimensional approaches may be productively applied alongside traditional tasks to facilitate a more nuanced picture of mindreading in adulthood, and to ensure construct validity whilst remaining sensitive to variation at the upper echelons of the ability. PMID:28174552
Adams, Sally; Penton-Voak, Ian S; Harmer, Catherine J; Holmes, Emily A; Munafò, Marcus R
2013-06-01
We have developed a new paradigm that targets the recognition of facial expressions of emotion. Here we report the protocol of a randomised controlled trial of the effects of emotion recognition training on mood in a sample of individuals with depressive symptoms over a 6-week follow-up period. We will recruit 190 adults from the general population who report high levels of depressive symptoms (defined as a score ≥ 14 on the Beck Depression Inventory-II). Participants will attend a screening session and will be randomised to intervention or control procedures, repeated five times over consecutive days (Monday to Friday). A follow-up session will take place at end of treatment, and 2 weeks and 6 weeks after training. Our primary study outcome will be depressive symptoms, Beck Depression Inventory-II (rated over the past two weeks). Our secondary outcomes are: depressive symptoms, Hamilton Rating Scale for Depression; anxiety symptoms, Beck Anxiety Inventory (rated over the past month); positive affect, Positive and Negative Affect Schedule (rated as 'how you feel right now'); negative affect, Positive and Negative Affect Schedule (rated as 'how you feel right now'); emotion sensitivity, Emotion Recognition Task (test phase); approach motivation and persistence, the Fishing Game; and depressive interpretation bias, Scrambled Sentences Test. This study is of a novel cognitive bias modification technique that targets biases in emotional processing characteristic of depression, and which can be delivered automatically via computer, Internet or Smartphone. It therefore has the potential to be a valuable, cost-effective adjunctive treatment for depression which may be used together with more traditional psychotherapy, cognitive-behavioural therapy and pharmacotherapy. Trial registration: Current Controlled Trials ISRCTN17767674. PMID:23725208
Dissociation between recognition and detection advantage for facial expressions: a meta-analysis.
Nummenmaa, Lauri; Calvo, Manuel G
2015-04-01
Happy facial expressions are recognized faster and more accurately than other expressions in categorization tasks, whereas detection in visual search tasks is widely believed to be faster for angry than for happy faces. We used meta-analytic techniques to resolve this discrepancy between the categorization and detection advantages for positive versus negative facial expressions. Effect sizes were computed on the basis of the r statistic for a total of 34 recognition studies with 3,561 participants and 37 visual search studies with 2,455 participants, yielding a total of 41 effect sizes for recognition accuracy, 25 for recognition speed, and 125 for visual search speed. Random-effects meta-analysis was conducted to estimate effect sizes at the population level. For recognition tasks, an advantage in recognition accuracy and speed for happy expressions was found for all stimulus types. In contrast, for visual search tasks, moderator analysis revealed that a happy face detection advantage was restricted to photographic faces, whereas a clear angry face advantage was found for schematic and "smiley" faces. A robust detection advantage for nonhappy faces was observed even when stimulus emotionality was distorted by inversion or rearrangement of the facial features, suggesting that visual features primarily drive the search. We conclude that the recognition advantage for happy faces is a genuine phenomenon related to the processing of facial expression category and affective valence. In contrast, detection advantages toward either happy (photographic stimuli) or nonhappy (schematic) faces are contingent on visual stimulus features rather than facial expression, and may not involve categorical or affective processing. (c) 2015 APA, all rights reserved.
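To make the pooling step described above concrete, the following is a minimal sketch, in Python, of a DerSimonian-Laird random-effects meta-analysis over correlation-based effect sizes (Fisher r-to-z transform and inverse-variance weighting). It is not the authors' analysis code, and the study r values and sample sizes at the bottom are invented purely for illustration.

import numpy as np

def random_effects_meta(r, n):
    # DerSimonian-Laird random-effects pooling of r effect sizes via Fisher's z
    r, n = np.asarray(r, float), np.asarray(n, float)
    z = np.arctanh(r)                            # Fisher r-to-z transform
    v = 1.0 / (n - 3.0)                          # within-study variance of z
    w = 1.0 / v                                  # fixed-effect weights
    z_fixed = np.sum(w * z) / np.sum(w)
    q = np.sum(w * (z - z_fixed) ** 2)           # Cochran's Q heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(r) - 1)) / c)      # between-study variance estimate
    w_star = 1.0 / (v + tau2)                    # random-effects weights
    z_re = np.sum(w_star * z) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    # back-transform the pooled estimate and its 95% CI to the r metric
    return np.tanh(z_re), np.tanh(z_re - 1.96 * se), np.tanh(z_re + 1.96 * se)

# Hypothetical studies: one r and one sample size per study
pooled_r, ci_low, ci_high = random_effects_meta([0.35, 0.20, 0.50], [120, 80, 60])
print(f"pooled r = {pooled_r:.2f}, 95% CI [{ci_low:.2f}, {ci_high:.2f}]")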
Comparing Emotion Recognition Skills among Children with and without Jailed Parents.
Hindt, Lauren A; Davis, Laurel; Schubert, Erin C; Poehlmann-Tynan, Julie; Shlafer, Rebecca J
2016-01-01
Approximately five million children in the United States have experienced a co-resident parent's incarceration in jail or prison. Parental incarceration is associated with multiple risk factors for maladjustment, which may contribute to the increased likelihood of behavioral problems in this population. Few studies have examined early predictors of maladjustment among children with incarcerated parents, limiting scholars' understanding about potential points for prevention and intervention. Emotion recognition skills may play a role in the development of maladjustment and may be amenable to intervention. The current study examined whether emotion recognition skills differed between 3- to 8-year-old children with and without jailed parents. We hypothesized that children with jailed parents would have a negative bias in processing emotions and less accuracy compared to children without incarcerated parents. Data were drawn from 128 families, including 75 children (53.3% male, M = 5.37 years) with jailed parents and 53 children (39.6% male, M = 5.02 years) without jailed parents. Caregivers in both samples provided demographic information. Children performed an emotion recognition task in which they were asked to produce a label for photos expressing six different emotions (i.e., happy, surprised, neutral, sad, angry, and fearful). For scoring, the numbers of positive and negative labels were totaled; the number of negative labels provided for neutral and positive stimuli was totaled (measuring negative bias/overextension of negative labels); and valence accuracy (i.e., positive, negative, and neutral) and label accuracy were calculated. Results indicated a main effect of parental incarceration on the number of positive labels provided; children with jailed parents provided significantly fewer positive labels than the comparison group. There was also a main effect of parental incarceration on negative bias (the overextension of negative labels); children with jailed parents showed a negative bias compared to children without jailed parents. However, these findings did not hold when controlling for child age, race/ethnicity, receipt of special education services, and caregiver education. The results provide some evidence for an effect of the context of parental incarceration on the development of negative emotion recognition biases. Limitations and implications for future research and interventions are discussed. PMID:27504101
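As an illustration of the label-based scoring just described, here is a minimal sketch in Python of how counts of positive and negative labels, a negative-bias score (negative labels applied to neutral or positive photos) and valence/label accuracy could be computed. It is not the authors' scoring code; the valence mapping (including treating "surprised" as positive) and the example responses are assumptions for illustration only.

POSITIVE, NEGATIVE, NEUTRAL = "positive", "negative", "neutral"

# Hypothetical valence mapping for the six photo emotions and the children's labels
VALENCE = {"happy": POSITIVE, "surprised": POSITIVE, "neutral": NEUTRAL,
           "sad": NEGATIVE, "angry": NEGATIVE, "fearful": NEGATIVE}

def score_child(responses):
    # responses: list of (photo_emotion, child_label) pairs
    pos_labels = sum(VALENCE.get(lbl) == POSITIVE for _, lbl in responses)
    neg_labels = sum(VALENCE.get(lbl) == NEGATIVE for _, lbl in responses)
    # negative bias: negative labels given to neutral or positive photos
    neg_bias = sum(VALENCE.get(lbl) == NEGATIVE and VALENCE[photo] != NEGATIVE
                   for photo, lbl in responses)
    valence_acc = sum(VALENCE.get(lbl) == VALENCE[photo]
                      for photo, lbl in responses) / len(responses)
    label_acc = sum(lbl == photo for photo, lbl in responses) / len(responses)
    return {"positive_labels": pos_labels, "negative_labels": neg_labels,
            "negative_bias": neg_bias, "valence_accuracy": valence_acc,
            "label_accuracy": label_acc}

print(score_child([("happy", "happy"), ("neutral", "sad"), ("angry", "angry")]))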
Gonzalez de Artaza, Maider; Bustamante, Sonia; Orgaz, Pablo; Osa, Luis; Angosto, Virxinia; Valverde, Cristina; Bilbao, Amaia; Madrazo, Arantza; van Os, Jim; Gonzalez-Torres, Miguel Angel
2016-01-01
Background Facial emotion recognition (FER) is essential to guide social functioning and behaviour for interpersonal communication. FER may be altered in severe mental illness, such as in psychosis and in borderline personality disorder, but it is unclear whether these FER alterations are specifically related to psychosis. Awareness of FER alterations may be useful in clinical settings to improve treatment strategies. The aim of our study was to examine FER in patients with severe mental disorder and its relation with psychotic symptomatology. Materials and Methods Socio-demographic and clinical variables were collected. Alterations in emotion recognition were assessed in three groups: patients with first episode psychosis (FEP) (n = 64), borderline personality disorder patients (BPD) (n = 37) and healthy controls (n = 137), using the Degraded Facial Affect Recognition Task. The Positive and Negative Syndrome Scale, the Structured Interview for Schizotypy Revised and the Community Assessment of Psychic Experiences scales were used to assess positive psychotic symptoms. WAIS III subtests were used to assess IQ. Results Kruskal-Wallis analysis showed a significant difference between FEP patients, BPD patients and controls in the FER of neutral faces, and between FEP patients and controls in angry face recognition. No significant differences were found between groups in the fear or happy conditions. There was a significant difference between groups in the attribution of negative emotion to happy faces: BPD and FEP patients had a much higher tendency to recognize happy faces as negative. There was no association with the different symptom domains in either group. Conclusions FEP and BPD patients have problems in recognizing neutral faces more frequently than controls. Moreover, patients tend to over-report negative emotions in the recognition of happy faces. Although no relation between psychotic symptoms and FER alterations was found, these deficits could contribute to patients’ misinterpretations in daily life. PMID:27467692
Moral Cognition and Multiple Sclerosis: A Neuropsychological Study.
Realmuto, Sabrina; Dodich, Alessandra; Meli, Riccardo; Canessa, Nicola; Ragonese, Paolo; Salemi, Giuseppe; Cerami, Chiara
2018-05-30
Recent literature has shown that social cognition impairments may characterize the neuropsychological profile of Multiple Sclerosis (MS) patients. However, little is known about moral cognition in MS. In this study, we evaluated non-social, social, and moral cognitive performance in 45 relapsing-remitting MS patients. Patients underwent the Brief International Cognitive Assessment for Multiple Sclerosis battery, the Cognitive Estimation and Stroop tasks, the Ekman-60 Faces test, the Reading the Mind in the Eyes test and the Story-based Empathy Task. Additionally, a task of moral dilemmas including both "instrumental" and "incidental" conditions was administered to patients. Forty-five age-, gender- and education-matched healthy control subjects (HC) were enrolled for comparison. The majority of patients (77.6%) showed deficits on the non-social tasks, particularly in the executive domains. A subset of the MS sample (24%) presented with emotion recognition and socio-affective processing impairments. Overall, MS patients showed levels of moral judgment comparable to those of HC. The rate of yes/no responses in the resolution of moral dilemmas and the scores for attribution of emotional valence were comparable between groups. Nevertheless, lower moral permissibility and emotional arousal, particularly for the instrumental dilemmas, characterized the MS profile. Significant correlations between the attribution of emotional valence to moral actions and mentalizing scores emerged. Our findings expand the current literature on MS, supporting not only deficits in executive and socio-emotional domains but also low levels of permissibility of immoral actions and emotional detachment in the moral judgment process.
Hooker, Christine I; Bruce, Lori; Fisher, Melissa; Verosky, Sara C; Miyakawa, Asako; D'Esposito, Mark; Vinogradov, Sophia
2013-08-30
Both cognitive and social-cognitive deficits impact functional outcome in schizophrenia. Cognitive remediation studies indicate that targeted cognitive and/or social-cognitive training improves behavioral performance on trained skills. However, the neural effects of training in schizophrenia and their relation to behavioral gains are largely unknown. This study tested whether a 50-h intervention, which included both cognitive and social-cognitive training, would influence neural mechanisms that support social cognition. Schizophrenia participants completed a computer-based intervention of either auditory-based cognitive training (AT) plus social-cognition training (SCT) (N=11) or non-specific computer games (CG) (N=11). Assessments included a functional magnetic resonance imaging (fMRI) task of facial emotion recognition, and behavioral measures of cognition, social cognition, and functional outcome. The fMRI results showed the predicted group-by-time interaction. Results were strongest for emotion recognition of happiness, surprise and fear: relative to CG participants, AT+SCT participants showed a neural activity increase in bilateral amygdala, right putamen and right medial prefrontal cortex. Across all participants, pre-to-post intervention neural activity increase in these regions predicted behavioral improvement on an independent emotion perception measure (MSCEIT: Perceiving Emotions). Among AT+SCT participants alone, neural activity increase in the right amygdala predicted behavioral improvement in emotion perception. The findings indicate that combined cognitive and social-cognition training improves neural systems that support social-cognition skills. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Haut, Kristen; Saxena, Abhishek; Yin, Hong; Carol, Emily; Dodell-Feder, David; Lincoln, Sarah Hope; Tully, Laura; Keshavan, Matcheri; Seidman, Larry J.; Nahum, Mor; Hooker, Christine
2017-01-01
Background: Deficits in social cognition are prominent features of schizophrenia that play a large role in functional impairments and disability. Performance deficits in these domains are associated with altered activity in functional networks, including those that support social cognitive abilities such as emotion recognition. These social cognitive deficits and alterations in neural networks are present prior to the onset of frank psychotic symptoms and thus present a potential target for intervention in early phases of the illness, including in individuals at clinical high risk (CHR) for psychosis. This study assessed changes in social cognitive functional networks following targeted cognitive training (TCT) in CHR individuals. Methods: 14 CHR subjects (7 male, mean age = 21.9) showing attenuated psychotic symptoms as assessed by the SIPS were included in the study. Subjects underwent a clinical evaluation and a functional MRI session prior to and after completing 40 hours (8 weeks) of targeted cognitive and social cognitive training using Lumosity and SocialVille. 14 matched healthy control (HC) subjects also underwent a single fMRI session as a comparison group for functional activity. Resting-state fMRI was acquired, as well as fMRI during performance of an emotion recognition task. Group-level differences in BOLD activity between the HC and CHR groups before TCT, and within the CHR group before and after TCT, were computed. Changes in social cognitive network functional connectivity at rest and during task performance were evaluated using seed-based connectivity analyses and psychophysiological interaction (PPI) analysis. Results: Prior to training, CHR individuals demonstrated hyperactivity in the amygdala, posterior cingulate, and superior temporal sulcus (STS) during emotion recognition, suggesting inefficient processing. This hyperactivity normalized somewhat after training, with CHR individuals showing less hyperactivity in the amygdala in response to emotional faces. In addition, training was associated with increased connectivity in emotion processing networks, including greater STS-medial prefrontal connectivity and normalization of amygdala connectivity patterns. Conclusion: These results suggest that targeted cognitive training produced improvements in emotion recognition and may be effective in altering functional network connectivity in networks associated with psychosis risk. TCT may be a useful tool for early intervention in individuals at risk for psychotic disorders to address behaviors that impact functional outcome.
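For readers unfamiliar with seed-based connectivity, the following is a minimal, illustrative sketch in Python of the basic computation: correlate a seed region's mean BOLD time series with every voxel's time series and Fisher-transform the resulting map for group statistics. The array shapes, variable names and random data are assumptions for illustration; this is not the study's preprocessing or PPI pipeline.

import numpy as np

def seed_connectivity(voxel_ts, seed_mask):
    # voxel_ts: (n_timepoints, n_voxels) BOLD data; seed_mask: boolean (n_voxels,)
    seed = voxel_ts[:, seed_mask].mean(axis=1)               # mean seed time series
    vz = (voxel_ts - voxel_ts.mean(0)) / voxel_ts.std(0)     # z-score each voxel
    sz = (seed - seed.mean()) / seed.std()
    r = vz.T @ sz / len(sz)                                  # Pearson r per voxel
    return np.arctanh(r)                                     # Fisher z map

# Hypothetical data: 200 timepoints, 5000 voxels; the first 50 voxels form the seed
rng = np.random.default_rng(1)
data = rng.standard_normal((200, 5000))
mask = np.zeros(5000, dtype=bool)
mask[:50] = True
z_map = seed_connectivity(data, mask)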
Cognitive Therapy Abilities in People with Learning Disabilities
ERIC Educational Resources Information Center
Sams, Kathryn; Collins, Suzanne; Reynolds, Shirley
2006-01-01
Background: There is a need to develop and adapt therapies for use with people with learning disabilities who have mental health problems. Aims: To examine the performance of people with learning disabilities on two cognitive therapy tasks (emotion recognition and discrimination among thoughts, feelings and behaviours). We hypothesized that…
2014-01-01
Background It is widely accepted that emotion processing difficulties are involved in Autism Spectrum Conditions (ASC). An increasing number of studies have focused on the development of training programs and have shown promising results. However, most of these programs are appropriate for individuals with high-functioning ASC (HFA) but exclude individuals with low-functioning ASC (LFA). We have developed a computer-based game called JeStiMulE, based on logical skills, to teach emotions to individuals with ASC, independently of their age and intellectual, verbal and academic level. The aim of the present study was to verify the usability of JeStiMulE (that is, its adaptability, effectiveness and efficiency) in a heterogeneous ASC group. We hypothesized that after JeStiMulE training, a performance improvement would be found in emotion recognition tasks. Methods A heterogeneous group of thirty-three children and adolescents with ASC received two one-hour JeStiMulE sessions per week over four weeks. In order to verify the usability of JeStiMulE, game data were collected for each participant. Furthermore, all participants were presented before and after training with five emotion recognition tasks, two including pictures of game avatars (faces and gestures) and three including pictures of real-life characters (faces, gestures and social scenes). Results Descriptive data showed suitable adaptability, effectiveness and efficiency of JeStiMulE. Results revealed a significant main effect of Session on avatars (ANOVA: F(1,32) = 98.48, P < .001) and on pictures of real-life characters (ANOVA: F(1,32) = 49.09, P < .001). A significant Session × Task × Emotion interaction was also found for avatars (ANOVA: F(6,192) = 2.84, P = .01). This triple interaction was close to significance for pictures of real-life characters (ANOVA: F(12,384) = 1.73, P = .057). Post-hoc analyses revealed a significant increase after training in 30 out of the 35 conditions. Conclusions JeStiMulE appears to be a promising tool to teach emotion recognition not only to individuals with HFA but also to those with LFA. JeStiMulE is thus based on ASC-specific skills, offering a model of logical processing of social information to compensate for difficulties with intuitive social processing. Trial registration Comité de Protection des Personnes Sud Méditerranée V (CPP): reference number 11.046 (https://cpp-sud-mediterranee-v.fr/). PMID:25018866
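To show the shape of the repeated-measures analysis reported above (within-subject factors Session, Task and Emotion), here is a minimal sketch in Python using statsmodels' AnovaRM. The factor levels, column names and simulated accuracy scores are assumptions for illustration; this is not the authors' analysis script and will not reproduce the reported F values.

import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
subjects = range(1, 34)                      # 33 participants, as in the abstract
sessions = ["pre", "post"]
tasks = ["avatar_faces", "avatar_gestures"]  # hypothetical task levels
emotions = ["joy", "anger", "fear", "sadness"]

rows = [{"subject": s, "session": se, "task": t, "emotion": e,
         "accuracy": rng.uniform(0.4, 0.9) + (0.05 if se == "post" else 0.0)}
        for s in subjects for se in sessions for t in tasks for e in emotions]
df = pd.DataFrame(rows)                      # one observation per subject per cell

res = AnovaRM(data=df, depvar="accuracy", subject="subject",
              within=["session", "task", "emotion"]).fit()
print(res)                                   # F and p values for main effects and interactions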
Villain, Hélène; Benkahoul, Aïcha; Drougard, Anne; Lafragette, Marie; Muzotte, Elodie; Pech, Stéphane; Bui, Eric; Brunet, Alain; Birmes, Philippe; Roullet, Pascal
2016-01-01
Memory reconsolidation impairment using the β-noradrenergic receptor blocker propranolol is a promising novel treatment avenue for patients suffering from pathogenic memories, such as in post-traumatic stress disorder (PTSD). However, in order to better inform targeted treatment development, the effects of this compound on memory need to be better characterized via translational research. We examined the effects of systemic propranolol administration in mice undergoing a wide range of behavioral tests to determine more specifically which aspects of memory consolidation and reconsolidation are impaired by propranolol. We found that propranolol (10 mg/kg) affected memory consolidation in non-aversive tasks (object recognition and object location) but not in moderately aversive (Morris water maze, MWM) to highly aversive (passive avoidance, conditioned taste aversion) tasks. Further, propranolol impaired memory reconsolidation in the most and in the least aversive tasks, but not in the moderately aversive task, suggesting that its amnesic effect was not related to task aversiveness. Moreover, in aquatic object recognition and location tasks in which animals were forced to behave (contrary to the classic versions of the tasks), propranolol did not impair memory reconsolidation. Taken together, our results suggest that the memory impairment observed after propranolol administration may result from a modification of the emotional valence of the memory rather than a disruption of the contextual component of the memory trace. This is relevant to the use of propranolol to block memory reconsolidation in individuals with PTSD, as such a treatment would not erase the traumatic memory but only reduce the emotional valence associated with the event. PMID:27014009
Høyland, Anne Lise; Nærland, Terje; Engstrøm, Morten; Lydersen, Stian; Andreassen, Ole Andreas
2017-01-01
Altered processing of emotions may contribute to a reduced ability for social interaction and communication in autism spectrum disorder (ASD). We investigated how face-emotion recognition in ASD differs from typical development across adolescent age groups. Fifty adolescents diagnosed with ASD and 49 typically developing adolescents (age 12-21 years) were included. The ASD diagnosis was underpinned by the parent-rated Social Communication Questionnaire. We used a cued GO/NOGO task with pictures of facial expressions and recorded reaction time, intra-individual variability of reaction time and omissions/commissions. The Social Responsiveness Scale was used as a measure of social function. Analyses were conducted for the whole group and for young (< 16 years) and old (≥ 16 years) age groups. We found no significant differences in any task measures between the whole typically developing group and the ASD group, and no significant correlations with the Social Responsiveness Scale. However, there was a non-significant tendency towards longer reaction time in the young group with ASD (p = 0.099). The Social Responsiveness Scale correlated positively with reaction time (r = 0.30, p = 0.032) and intra-individual variability in reaction time (r = 0.29, p = 0.037) in the young group and, in contrast, negatively in the old group (r = -0.23, p = 0.13; r = -0.38, p = 0.011, respectively), giving significant age group interactions for both reaction time (p = 0.008) and intra-individual variability in reaction time (p = 0.001). Our findings suggest an age-dependent association between emotion recognition and severity of social problems, indicating a delayed development of emotional understanding in ASD. They also point towards alterations in top-down attention control in the ASD group. This suggests novel disease-related features that should be investigated in more detail in experimental settings.
Templier, Lorraine; Chetouani, Mohamed; Plaza, Monique; Belot, Zoé; Bocquet, Patrick; Chaby, Laurence
2015-03-01
Patients with Alzheimer's disease (AD) show cognitive and behavioral disorders, which they and their caregivers have difficulty coping with in daily life. Psychological symptoms seem to be increased by impaired emotion processing in patients, an ability that is linked to social cognition and thus essential for maintaining good interpersonal relationships. Non-verbal emotion processing is a genuine way to communicate, especially for patients whose language may be rapidly impaired. Many studies focus on emotion identification in AD patients, mostly by means of facial expressions rather than emotional prosody; even fewer consider emotional prosody production, despite its playing a key role in interpersonal exchanges. The literature on this subject is scarce, with contradictory results. The present study compares the performance of 14 AD patients (88.4±4.9 yrs; MMSE: 19.9±2.7) to that of 14 control subjects (87.5±5.1 yrs; MMSE: 28.1±1.4) on tasks of emotion identification through faces and voices (non-linguistic vocal emotion or emotional prosody) and on a task of emotional prosody production (12 sentences to be pronounced in a neutral, positive, or negative tone after a context was read). The Alzheimer's disease patients showed weaker performance than control subjects on all emotional recognition tasks, particularly when identifying emotional prosody. A negative relation between the identification scores and the NPI (professional caregivers) scores was found, which underlines their link to psychological and behavioral disorders. The production of emotional prosody seems relatively preserved in the mild to moderate stage of the disease: we found subtle differences in acoustic parameters, but qualitatively, judges rated the patients' productions as being as good as those of control subjects. These results suggest interesting new directions for improving patients' care.
The autistic child's appraisal of expressions of emotion.
Hobson, R P
1986-05-01
Groups of MA-matched autistic, normal and non-autistic retarded children were tested for their ability to choose drawn and photographed facial expressions of emotion to "go with" a person videotaped in gestures, vocalizations and contexts indicative of four emotional states. Although both autistic and control subjects were adept in choosing drawings of non-personal objects to correspond with videotaped cues, the autistic children were markedly impaired in selecting the appropriate faces for the videotaped expressions and contexts. Within the autistic group, the children's performance in this task of emotion recognition was related to MA. It is suggested that autistic children have difficulty in recognizing how different expressions of particular emotions are associated with each other, and that this might contribute to their failure to understand the emotional states of other people.
Disrupted neural processing of emotional faces in psychopathy.
Contreras-Rodríguez, Oren; Pujol, Jesus; Batalla, Iolanda; Harrison, Ben J; Bosque, Javier; Ibern-Regàs, Immaculada; Hernández-Ribas, Rosa; Soriano-Mas, Carles; Deus, Joan; López-Solà, Marina; Pifarré, Josep; Menchón, José M; Cardoner, Narcís
2014-04-01
Psychopaths show a reduced ability to recognize emotional facial expressions, which may disturb the development of interpersonal relationships and successful social adaptation. Behavioral hypotheses point toward an association between emotion recognition deficits in psychopathy and amygdala dysfunction. Our prediction was that amygdala dysfunction would combine deficient activation with disturbances in functional connectivity with cortical regions of the face-processing network. Twenty-two psychopaths and 22 control subjects were assessed, and functional magnetic resonance maps were generated to identify both brain activation and task-induced functional connectivity using psychophysiological interaction analysis during an emotional face-matching task. Results showed significant amygdala activation in control subjects only, but differences between study groups did not reach statistical significance. In contrast, psychopaths showed significantly increased activation in visual and prefrontal areas, with this latter activation being associated with psychopaths' affective-interpersonal disturbances. Psychophysiological interaction analyses revealed a reciprocal reduction in functional connectivity between the left amygdala and visual and prefrontal cortices. Our results suggest that emotional stimulation may evoke a relevant cortical response in psychopaths, but that a disruption in the processing of emotional faces exists involving the reciprocal functional interaction between the amygdala and neocortex, consistent with the notion of a failure to integrate emotion into cognition in psychopathic individuals.
Contemplative/emotion training reduces negative emotional behavior and promotes prosocial responses.
Kemeny, Margaret E; Foltz, Carol; Cavanagh, James F; Cullen, Margaret; Giese-Davis, Janine; Jennings, Patricia; Rosenberg, Erika L; Gillath, Omri; Shaver, Phillip R; Wallace, B Alan; Ekman, Paul
2012-04-01
Contemplative practices are believed to alleviate psychological problems, cultivate prosocial behavior and promote self-awareness. In addition, psychological science has developed tools and models for understanding the mind and promoting well-being. Additional effort is needed to combine frameworks and techniques from these traditions to improve emotional experience and socioemotional behavior. An 8-week intensive (42 hr) meditation/emotion regulation training intervention was designed by experts in contemplative traditions and emotion science to reduce "destructive enactment of emotions" and enhance prosocial responses. Participants were 82 healthy female schoolteachers who were randomly assigned to a training group or a wait-list control group, and assessed preassessment, postassessment, and 5 months after training completion. Assessments included self-reports and experimental tasks to capture changes in emotional behavior. The training group reported reduced trait negative affect, rumination, depression, and anxiety, and increased trait positive affect and mindfulness compared to the control group. On a series of behavioral tasks, the training increased recognition of emotions in others (Micro-Expression Training Tool), protected trainees from some of the psychophysiological effects of an experimental threat to self (Trier Social Stress Test; TSST), appeared to activate cognitive networks associated with compassion (lexical decision procedure), and affected hostile behavior in the Marital Interaction Task. Most effects at postassessment that were examined at follow-up were maintained (excluding positive affect, TSST rumination, and respiratory sinus arrhythmia recovery). Findings suggest that increased awareness of mental processes can influence emotional behavior, and they support the benefit of integrating contemplative theories/practices with psychological models and methods of emotion regulation. (PsycINFO Database Record (c) 2012 APA, all rights reserved).
Miskowiak, K W; Larsen, J E; Harmer, C J; Siebner, H R; Kessing, L V; Macoveanu, J; Vinberg, M
2018-01-15
Negative cognitive bias and aberrant neural processing of self-referent emotional words seem to be trait markers of depression. However, it is unclear whether these neurocognitive changes are present in unaffected first-degree relatives and thus constitute an illness endophenotype. Fifty-three healthy, never-depressed monozygotic or dizygotic twins with a co-twin history of depression (high-risk group: n = 26) or no first-degree family history of depression (low-risk group: n = 27) underwent neurocognitive testing and functional magnetic resonance imaging (fMRI) as part of a follow-up cohort study. Participants performed a self-referent emotional word categorisation task and a free word recall task, followed by a recognition task during fMRI. Participants also completed questionnaires assessing mood, personality traits and coping strategies. High-risk and low-risk twins (age, mean ± SD: 40 ± 11) were well balanced for demographic variables, mood, coping and neuroticism. High-risk twins showed lower accuracy during self-referent categorisation of emotional words independent of valence, and more false recollections of negative words than low-risk twins during free recall. Functional MRI yielded no differences between high-risk and low-risk twins in retrieval-specific neural activity for positive or negative words, or during the recognition of negative versus positive words, within the hippocampus or prefrontal cortex. The subtle display of negative recall bias is consistent with the hypothesis that self-referent negative memory bias is an endophenotype for depression. High-risk twins' lower categorisation accuracy adds to the evidence for valence-independent cognitive deficits in individuals at familial risk for depression. Copyright © 2017 Elsevier B.V. All rights reserved.
Effects of short-term quetiapine treatment on emotional processing, sleep and circadian rhythms.
Rock, Philippa L; Goodwin, Guy M; Wulff, Katharina; McTavish, Sarah F B; Harmer, Catherine J
2016-03-01
Quetiapine is an atypical antipsychotic that can stabilise mood from any index episode of bipolar disorder. This study investigated the effects of seven-day quetiapine administration on sleep, circadian rhythms and emotional processing in healthy volunteers. Twenty healthy volunteers received 150 mg quetiapine XL for seven nights and 20 matched controls received placebo. Sleep-wake actigraphy was completed for one week both pre-dose and during drug treatment. On Day 8, participants completed emotional processing tasks. Actigraphy revealed that quetiapine treatment increased sleep duration and efficiency, delayed final wake time and had a tendency to reduce within-day variability. There were no effects of quetiapine on subjective ratings of mood or energy. Quetiapine-treated participants showed diminished bias towards positive words and away from negative words during recognition memory. Quetiapine did not significantly affect facial expression recognition, emotional word categorisation, emotion-potentiated startle or emotional word/faces dot-probe vigilance reaction times. These changes in sleep timing and circadian rhythmicity in healthy volunteers may be relevant to quetiapine's therapeutic actions. Effects on emotional processing did not emulate the effects of antidepressants. The effects of quetiapine on sleep and circadian rhythms in patients with bipolar disorder merit further investigation to elucidate its mechanisms of action. © The Author(s) 2016.
Children's Recognition of Emotional Facial Expressions Through Photographs and Drawings.
Brechet, Claire
2017-01-01
The author's purpose was to examine children's recognition of emotional facial expressions by comparing two types of stimulus: photographs and drawings. The author aimed to investigate whether drawings could be considered a more evocative material than photographs, as a function of age and emotion. Five- and 7-year-old children were presented with photographs and drawings displaying facial expressions of 4 basic emotions (i.e., happiness, sadness, anger, and fear) and were asked to perform a matching task by pointing to the face corresponding to the target emotion labeled by the experimenter. The photographs we used were selected from the Radboud Faces Database, and the drawings were designed on the basis of both the facial components involved in the expression of these emotions and the graphic cues children tend to use when asked to depict these emotions in their own drawings. Our results show that drawings are better recognized than photographs for sadness, anger, and fear (with no difference for happiness, due to a ceiling effect), and that the difference between the two types of stimuli tends to be larger for 5-year-olds than for 7-year-olds. These results are discussed in view of their implications, both for future research and for practical application.
fMRI of parents of children with Asperger Syndrome: a pilot study.
Baron-Cohen, Simon; Ring, Howard; Chitnis, Xavier; Wheelwright, Sally; Gregory, Lloyd; Williams, Steve; Brammer, Mick; Bullmore, Ed
2006-06-01
People with autism or Asperger Syndrome (AS) show altered patterns of brain activity during visual search and emotion recognition tasks. Autism and AS are genetic conditions and parents may show the 'broader autism phenotype.' (1) To test if parents of children with AS show atypical brain activity during a visual search and an empathy task; (2) to test for sex differences during these tasks at the neural level; (3) to test if parents of children with autism are hyper-masculinized, as might be predicted by the 'extreme male brain' theory. We used fMRI during a visual search task (the Embedded Figures Test (EFT)) and an emotion recognition test (the 'Reading the Mind in the Eyes' (or Eyes) test). Twelve parents of children with AS were compared with 12 sex-matched controls. Factorial analysis was used to map main effects of sex, group (parents vs. controls), and the sex-by-group interaction on brain function. An ordinal ANOVA also tested for regions of brain activity where females>males>fathers=mothers, to test for parental hyper-masculinization. RESULTS ON EFT TASK: Female controls showed more activity in extrastriate cortex than male controls, and both mothers and fathers showed even less activity in this area than sex-matched controls. There were no differences in group activation between mothers and fathers of children with AS. The ordinal ANOVA identified two specific regions in visual cortex (right and left, respectively) that showed the pattern Females>Males>Fathers=Mothers, both in BA 19. RESULTS ON EYES TASK: Male controls showed more activity in the left inferior frontal gyrus than female controls, and both mothers and fathers showed even more activity in this area compared to sex-matched controls. Female controls showed greater bilateral inferior frontal activation than males. This was not seen when comparing mothers to males, or mothers to fathers. The ordinal ANOVA identified two specific regions that showed the pattern Females>Males>Mothers=Fathers: left medial temporal gyrus (BA 21) and left dorsolateral prefrontal cortex (BA 44). Parents of children with AS show atypical brain function during both visual search and emotion recognition, in the direction of hyper-masculinization of the brain. Because of the small sample size, and lack of age-matching between parents and controls, such results constitute a pilot study that needs replicating with larger samples.
Poor sleep quality is associated with a negative cognitive bias and decreased sustained attention.
Gobin, Christina M; Banks, Jonathan B; Fins, Ana I; Tartar, Jaime L
2015-10-01
Poor sleep quality has been demonstrated to diminish cognitive performance, impair psychosocial functioning and alter the perception of stress. At present, however, there is little understanding of how sleep quality affects emotion processing. The aim of the present study was to determine the extent to which sleep quality, measured through the Pittsburgh Sleep Quality Index, influences affective symptoms as well as the interaction between stress and performance on an emotional memory test and sustained attention task. To that end, 154 undergraduate students (mean age: 21.27 years, standard deviation = 4.03) completed a series of measures, including the Pittsburgh Sleep Quality Index, the Sustained Attention to Response Task, an emotion picture recognition task and affective symptom questionnaires following either a control or physical stress manipulation, the cold pressor test. As sleep quality and psychosocial functioning differ among chronotypes, we also included chronotype and time of day as variables of interest to ensure that the effects of sleep quality on the emotional and non-emotional tasks were not attributed to these related factors. We found that poor sleep quality is related to greater depressive symptoms, anxiety and mood disturbances. While an overall relationship between global Pittsburgh Sleep Quality Index score and emotion and attention measures was not supported, poor sleep quality, as an independent component, was associated with better memory for negative stimuli and a deficit in sustained attention to non-emotional stimuli. Importantly, these effects were not sensitive to stress, chronotype or time of day. Combined, these results suggest that individuals with poor sleep quality show an increase in affective symptomatology as well as a negative cognitive bias with a concomitant decrease in sustained attention to non-emotional stimuli. © 2015 European Sleep Research Society.
Adrenergic enhancement of consolidation of object recognition memory.
Dornelles, Arethuza; de Lima, Maria Noemia Martins; Grazziotin, Manoela; Presti-Torres, Juliana; Garcia, Vanessa Athaide; Scalco, Felipe Siciliani; Roesler, Rafael; Schröder, Nadja
2007-07-01
Extensive evidence indicates that epinephrine (EPI) modulates memory consolidation for emotionally arousing tasks in animals and human subjects. However, previous studies have not examined the effects of EPI on consolidation of recognition memory. Here we report that systemic administration of EPI enhances consolidation of memory for a novel object recognition (NOR) task under different training conditions. Control male rats given a systemic injection of saline (SAL; 0.9% NaCl) immediately after NOR training showed significant memory retention when tested at 1.5 or 24 h, but not 96 h, after training. In contrast, rats given a post-training injection of EPI showed significant retention of NOR at all delays. In a second experiment using a different training condition, rats treated with EPI, but not SAL-treated animals, showed significant NOR retention at both the 1.5- and 24-h delays. We next showed that the EPI-induced enhancement of retention tested 96 h after training was prevented by pretraining systemic administration of the beta-adrenoceptor antagonist propranolol. The findings suggest that, as previously observed in experiments using aversively motivated tasks, epinephrine modulates consolidation of recognition memory and that the effects require activation of beta-adrenoceptors.
Martinez, Maria; Multani, Namita; Anor, Cassandra J.; Misquitta, Karen; Tang-Wai, David F.; Keren, Ron; Fox, Susan; Lang, Anthony E.; Marras, Connie; Tartaglia, Maria C.
2018-01-01
Background: Changes in social cognition occur in patients with Alzheimer’s disease (AD) and Parkinson’s disease (PD) and can be caused by several factors, including emotion recognition deficits and neuropsychiatric symptoms (NPS). The aims of this study were to investigate: (1) group differences on emotion detection between patients diagnosed with AD or PD and their respective caregivers; (2) the association of emotion detection with empathetic ability and NPS in individuals with AD or PD; (3) caregivers’ depression and perceived burden in relation to patients’ ability to detect emotions, their ability to empathize with others, and the presence of NPS; and (4) caregivers’ awareness of emotion detection deficits in patients with AD or PD. Methods: In this study, patients with probable AD (N = 25) or PD (N = 17), and their caregivers (N = 42), performed an emotion detection task (The Awareness of Social Inference Test—Emotion Evaluation Test, TASIT-EET). Patients underwent cognitive assessment, using the Behavioral Neurology Assessment (BNA). In addition, caregivers completed questionnaires to measure empathy (Interpersonal Reactivity Index, IRI) and NPS (Neuropsychiatric Inventory, NPI) in patients and self-reported on depression (Geriatric Depression Scale, GDS) and burden (Zarit Burden Interview, ZBI). Caregivers were also interviewed to measure dementia severity (Clinical Dementia Rating (CDR) Scale) in patients. Results: The results suggest that individuals with AD and PD are significantly worse at recognizing emotions than their caregivers. Moreover, caregivers failed to recognize patients’ emotion recognition deficits and this was associated with increased caregiver burden and depression. Patients’ emotion recognition deficits, decreased empathy and NPS were also related to caregiver burden and depression. Conclusions: Changes in emotion detection and empathy in individuals with AD and PD have implications for caregiver burden and depression and may be amenable to interventions with both patients and caregivers. PMID:29740312
Textual emotion recognition for enhancing enterprise computing
NASA Astrophysics Data System (ADS)
Quan, Changqin; Ren, Fuji
2016-05-01
The growing interest in affective computing (AC) brings a lot of valuable research topics that can meet different application demands in enterprise systems. The present study explores a subarea of AC techniques: textual emotion recognition for enhancing enterprise computing. Multi-label emotion recognition in text is able to provide a more comprehensive understanding of emotions than single-label emotion recognition. A representation of 'emotion state in text' is proposed to encompass the multidimensional emotions in text. It provides a formal description of the configurations of basic emotions and of the relations between them. Our method allows recognition of emotions for words that carry indirect emotions, emotional ambiguity and multiple emotions. We further investigate the effect of word order on emotional expression by comparing the performance of a bag-of-words model and a sequence model for multi-label sentence emotion recognition. The experiments show that classification results under the sequence model are better than under the bag-of-words model, and a homogeneous Markov model showed promising results for multi-label sentence emotion recognition. This emotion recognition system is able to provide a convenient way to acquire valuable emotion information and to improve enterprise competitive ability in many aspects.
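The bag-of-words approach that the sequence model is compared against can be sketched briefly. The tiny corpus, emotion labels, and classifier below are invented for illustration and are not the authors' data or models; the sketch only shows how a one-vs-rest, multi-label classifier over unordered word counts is assembled.

    # Minimal bag-of-words, multi-label emotion classifier (illustrative only).
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.multiclass import OneVsRestClassifier
    from sklearn.preprocessing import MultiLabelBinarizer

    sentences = [                                     # hypothetical examples
        "I can't believe he lied to me again",
        "The old photographs made her smile and cry",
        "What a wonderful surprise party!",
        "I'm dreading the results of the test",
    ]
    labels = [{"anger", "surprise"}, {"joy", "sadness"}, {"joy", "surprise"}, {"fear"}]

    vectorizer = CountVectorizer()            # bag-of-words: word order is discarded
    X = vectorizer.fit_transform(sentences)
    binarizer = MultiLabelBinarizer()         # one binary column per emotion label
    Y = binarizer.fit_transform(labels)

    clf = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)
    pred = clf.predict(vectorizer.transform(["She smiled through her tears"]))
    print(dict(zip(binarizer.classes_, pred[0])))  # 0/1 decision for each emotion

A sequence model of the kind the abstract favours would instead model transitions between word-level emotion states (for example with a Markov chain), which is exactly the ordering information the bag-of-words representation throws away.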
Harmer, Catherine J; Shelley, Nicholas C; Cowen, Philip J; Goodwin, Guy M
2004-07-01
Antidepressants that inhibit the reuptake of serotonin (SSRIs) or norepinephrine (SNRIs) are effective in the treatment of disorders such as depression and anxiety. Cognitive psychological theories emphasize the importance of correcting negative biases of information processing in the nonpharmacological treatment of these disorders, but it is not known whether antidepressant drugs can directly modulate the neural processing of affective information. The present study therefore assessed the actions of repeated antidepressant administration on perception and memory for positive and negative emotional information in healthy volunteers. Forty-two male and female volunteers were randomly assigned to 7 days of double-blind intervention with the SSRI citalopram (20 mg/day), the SNRI reboxetine (8 mg/day), or placebo. On the final day, facial expression recognition, emotion-potentiated startle response, and memory for affect-laden words were assessed. Questionnaires monitoring mood, hostility, and anxiety were given before and after treatment. In the facial expression recognition task, citalopram and reboxetine reduced the identification of the negative facial expressions of anger and fear. Citalopram also abolished the increased startle response found in the context of negative affective images. Both antidepressants increased the relative recall of positive (versus negative) emotional material. These changes in emotional processing occurred in the absence of significant differences in ratings of mood and anxiety. However, reboxetine decreased subjective ratings of hostility and elevated energy. Short-term administration of two different antidepressant types had similar effects on emotion-related tasks in healthy volunteers, reducing the processing of negative relative to positive emotional material. Such effects of antidepressants may ameliorate the negative biases in information processing that characterize mood and anxiety disorders. They also suggest a mechanism of action potentially compatible with cognitive theories of anxiety and depression.
Reading Emotion From Mouse Cursor Motions: Affective Computing Approach.
Yamauchi, Takashi; Xiao, Kunchen
2018-04-01
Affective computing research has advanced emotion recognition systems using facial expressions, voices, gaits, and physiological signals, yet these methods are often impractical. This study integrates mouse cursor motion analysis into affective computing and investigates the idea that movements of the computer cursor can provide information about emotion of the computer user. We extracted 16-26 trajectory features during a choice-reaching task and examined the link between emotion and cursor motions. Positive or negative emotions were induced in participants by music, film clips, or emotional pictures, and participants indicated their emotions with questionnaires. Our 10-fold cross-validation analysis shows that statistical models formed from "known" participants (training data) could predict nearly 10%-20% of the variance of positive affect and attentiveness ratings of "unknown" participants, suggesting that cursor movement patterns such as the area under curve and direction change help infer emotions of computer users. Copyright © 2017 Cognitive Science Society, Inc.
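Two of the trajectory features named above, area under the curve and direction change, can be computed from raw (x, y) cursor samples roughly as follows. The exact feature definitions are assumptions for illustration rather than the authors' specifications.

    # Rough sketch of two cursor-trajectory features: area between the cursor path
    # and the straight start-to-end line, and the number of horizontal reversals.
    import numpy as np

    def cursor_features(x, y):
        x, y = np.asarray(x, float), np.asarray(y, float)
        start, end = np.array([x[0], y[0]]), np.array([x[-1], y[-1]])
        d = (end - start) / np.linalg.norm(end - start)   # unit vector of ideal path
        pts = np.column_stack([x, y]) - start
        along = pts @ d                                    # progress along ideal path
        perp = pts[:, 0] * d[1] - pts[:, 1] * d[0]         # signed deviation from it
        # trapezoid-rule area between the cursor path and the ideal straight line
        auc = float(np.sum((np.abs(perp[:-1]) + np.abs(perp[1:])) / 2 * np.diff(along)))
        dx = np.diff(x)
        flips = int(np.sum(np.sign(dx[:-1]) * np.sign(dx[1:]) < 0))  # x-direction changes
        return auc, flips

    # Example: a curved reach from (0, 0) to (100, 100)
    t = np.linspace(0, 1, 50)
    print(cursor_features(100 * t + 15 * np.sin(np.pi * t), 100 * t))

Features of this kind would then feed a regression model scored with 10-fold cross-validation (for example scikit-learn's cross_val_score) to predict the affect ratings of held-out participants.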
Risk for bipolar disorder is associated with face-processing deficits across emotions.
Brotman, Melissa A; Skup, Martha; Rich, Brendan A; Blair, Karina S; Pine, Daniel S; Blair, James R; Leibenluft, Ellen
2008-12-01
Youths with euthymic bipolar disorder (BD) have a deficit in face-emotion labeling that is present across multiple emotions. Recent research indicates that youths at familial risk for BD, but without a history of mood disorder, also have a deficit in face-emotion labeling, suggesting that such impairments may be an endophenotype for BD. It is unclear whether this deficit in at-risk youths is present across all emotions or if the impairment presents initially as an emotion-specific dysfunction that then generalizes to other emotions as the symptoms of BD become manifest. Thirty-seven patients with pediatric BD, 25 unaffected children with a first-degree relative with BD, and 36 typically developing youths were administered the Emotional Expression Multimorph Task, a computerized behavioral task, which presents gradations of facial emotions from 100% neutrality to 100% emotional expression (happiness, surprise, fear, sadness, anger, and disgust). Repeated-measures analysis of covariance revealed that, compared with the control youths, the patients and the at-risk youths required significantly more intense emotional information to identify and correctly label face emotions. The patients with BD and the at-risk youths did not differ from each other. Group-by-emotion interactions were not significant, indicating that the group effects did not differ based on the facial emotion. The youths at risk for BD demonstrate nonspecific deficits in face-emotion recognition, similar to patients with the illness. Further research is needed to determine whether such deficits meet all the criteria for an endophenotype.
Functional architecture of visual emotion recognition ability: A latent variable approach.
Lewis, Gary J; Lefevre, Carmen E; Young, Andrew W
2016-05-01
Emotion recognition has been a focus of considerable attention for several decades. However, despite this interest, the underlying structure of individual differences in emotion recognition ability has been largely overlooked and thus is poorly understood. For example, limited knowledge exists concerning whether recognition ability for one emotion (e.g., disgust) generalizes to other emotions (e.g., anger, fear). Furthermore, it is unclear whether emotion recognition ability generalizes across modalities, such that those who are good at recognizing emotions from the face, for example, are also good at identifying emotions from nonfacial cues (such as cues conveyed via the body). The primary goal of the current set of studies was to address these questions through establishing the structure of individual differences in visual emotion recognition ability. In three independent samples (Study 1: n = 640; Study 2: n = 389; Study 3: n = 303), we observed that the ability to recognize visually presented emotions is based on different sources of variation: a supramodal emotion-general factor, supramodal emotion-specific factors, and face- and within-modality emotion-specific factors. In addition, we found evidence that general intelligence and alexithymia were associated with supramodal emotion recognition ability. Autism-like traits, empathic concern, and alexithymia were independently associated with face-specific emotion recognition ability. These results (a) provide a platform for further individual differences research on emotion recognition ability, (b) indicate that differentiating levels within the architecture of emotion recognition ability is of high importance, and (c) show that the capacity to understand expressions of emotion in others is linked to broader affective and cognitive processes. (c) 2016 APA, all rights reserved.
Conson, Massimiliano; Errico, Domenico; Mazzarella, Elisabetta; Giordano, Marianna; Grossi, Dario; Trojano, Luigi
2015-01-01
Recent neurofunctional studies suggested that lateral prefrontal cortex is a domain-general cognitive control area modulating computation of social information. Neuropsychological evidence reported dissociations between cognitive and affective components of social cognition. Here, we tested whether performance on social cognitive and affective tasks can be modulated by transcranial direct current stimulation (tDCS) over dorsolateral prefrontal cortex (DLPFC). To this aim, we compared the effects of tDCS on explicit recognition of emotional facial expressions (affective task), and on one cognitive task assessing the ability to adopt another person's visual perspective. In a randomized, cross-over design, male and female healthy participants performed the two experimental tasks after bi-hemispheric tDCS (sham, left anodal/right cathodal, and right anodal/left cathodal) applied over DLPFC. Results showed that only in male participants explicit recognition of fearful facial expressions was significantly faster after anodal right/cathodal left stimulation with respect to anodal left/cathodal right and sham stimulations. In the visual perspective taking task, instead, anodal right/cathodal left stimulation negatively affected both male and female participants' tendency to adopt another's point of view. These findings demonstrated that concurrent facilitation of right and inhibition of left lateral prefrontal cortex can speed up males' responses to threatening faces whereas it interferes with the ability to adopt another's viewpoint independently of gender. Thus, stimulation of cognitive control areas can lead to different effects on social cognitive skills depending on the affective vs. cognitive nature of the task, and on the gender-related differences in neural organization of emotion processing.
Emotional intelligence in incarcerated men with psychopathic traits
Ermer, Elsa; Kahn, Rachel E.; Salovey, Peter; Kiehl, Kent A.
2012-01-01
The expression, recognition, and communication of emotional states are ubiquitous features of the human social world. Emotional intelligence (EI) is defined as the ability to perceive, manage, and reason about emotions, in oneself and others. Individuals with psychopathy have numerous difficulties in social interaction and show impairment on some emotional tasks. Here we investigate the relation between emotional intelligence and psychopathy in a sample of incarcerated men (n=374), using the Psychopathy Checklist—Revised (PCL-R; Hare, 2003) and the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT; Mayer, Salovey, & Caruso, 2002). The MSCEIT is a well-validated ability-based emotional intelligence measure that does not rely on self-report judgments of emotional skills. The Hare PCL-R is the gold-standard for the assessment of psychopathy in clinical populations. Controlling for general intelligence, psychopathy was associated with lower emotional intelligence. These findings suggest individuals with psychopathy are impaired on a range of emotional intelligence abilities and that emotional intelligence is an important area for understanding deficits in psychopathy. PMID:22329657
Shafer, Andrea T.; Dolcos, Florin
2014-01-01
The memory-enhancing effect of emotion has been linked to the engagement of emotion- and memory-related medial temporal lobe (MTL) regions (amygdala-AMY; hippocampus-HC; parahippocampus-PHC), during both encoding and retrieval. However, recognition tasks used to investigate the neural correlates of retrieval make it difficult to distinguish MTL engagement linked to retrieval success (RS) from that linked to incidental encoding success (ES) during retrieval. This issue has been investigated for retrieval of non-emotional memories, but not for emotional memory retrieval. To address this, we used event-related functional MRI in conjunction with an emotional distraction and two episodic memory tasks (one testing memory for distracter items and the other testing memory for new/lure items presented in the first memory task). This paradigm allowed for dissociation of MTL activity specifically linked to RS from that linked to both RS and incidental ES during retrieval. There were two novel findings regarding the neural correlates of emotional memory retrieval. First, greater emotional RS was identified bilaterally in AMY, HC, and PHC. However, AMY activity was most impacted when accounting for ES activity, as only RS activity in left AMY was dissociated from ES activity during retrieval, whereas portions of HC and PHC showing greater emotional RS were largely uninvolved in ES. Second, an earlier and more anteriorly spread response (left AMY and bilateral HC, PHC) was linked to greater emotional RS activity, whereas a later and more posteriorly localized response (right posterior PHC) was linked to greater neutral RS activity. These findings shed light on MTL mechanisms subserving the memory-enhancing effect of emotion at retrieval. PMID:24917798
Qiao-Tasserit, Emilie; Garcia Quesada, Maria; Antico, Lia; Bavelier, Daphne; Vuilleumier, Patrik; Pichon, Swann
2017-01-01
Both affective states and personality traits shape how we perceive the social world and interpret emotions. The literature on affective priming has mostly focused on brief influences of emotional stimuli and emotional states on perceptual and cognitive processes. Yet this approach does not fully capture more dynamic processes at the root of emotional states, with such states lingering beyond the duration of the inducing external stimuli. Our goal was to put in perspective three different types of affective states (induced affective states, more sustained mood states and affective traits such as depression and anxiety) and investigate how they may interact and influence emotion perception. Here, we hypothesized that absorption into positive and negative emotional episodes generates sustained affective states that outlast the episode period and bias the interpretation of facial expressions in a perceptual decision-making task. We also investigated how such effects are influenced by more sustained mood states and by individual affect traits (depression and anxiety) and whether they interact. Transient emotional states were induced using movie-clips, after which participants performed a forced-choice emotion classification task with morphed facial expressions ranging from fear to happiness. Using a psychometric approach, we show that negative (vs. neutral) clips increased participants' propensity to classify ambiguous faces as fearful during several minutes. In contrast, positive movies biased classification toward happiness only for those clips perceived as most absorbing. Negative mood, anxiety and depression had a stronger effect than transient states and increased the propensity to classify ambiguous faces as fearful. These results provide the first evidence that absorption and different temporal dimensions of emotions have a significant effect on how we perceive facial expressions. PMID:28151976
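The psychometric approach referred to above typically amounts to fitting a sigmoid to the proportion of 'happy' responses across the fear-to-happiness morph continuum and reading off its midpoint. The logistic form, morph levels and response proportions below are assumed for illustration and are not the authors' data or fitting procedure.

    # Illustrative psychometric fit for a forced-choice fear/happiness task.
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(morph, pse, slope):
        # P(respond 'happy') as a function of morph level (0 = fear, 1 = happiness)
        return 1.0 / (1.0 + np.exp(-(morph - pse) / slope))

    morph = np.linspace(0, 1, 7)                                     # 7 morph steps
    p_happy = np.array([0.02, 0.05, 0.20, 0.45, 0.80, 0.95, 0.99])   # fake proportions

    (pse, slope), _ = curve_fit(logistic, morph, p_happy, p0=[0.5, 0.1])
    print(f"point of subjective equality = {pse:.2f}, slope = {slope:.2f}")
    # A mood-induction effect would show up as a shift of the PSE, e.g. a negative
    # clip pushing it rightward (more morph needed before a face is judged happy).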
Kamboj, Sunjeev K; Joye, Alyssa; Bisby, James A; Das, Ravi K; Platt, Bradley; Curran, H Valerie
2013-05-01
Studies of affect recognition can inform our understanding of the interpersonal effects of alcohol and help develop a more complete neuropsychological profile of this drug. The objective of the study was to examine affect recognition in social drinkers using a novel dynamic affect-recognition task, sampling performance across a range of evolutionarily significant target emotions and neutral expressions. Participants received 0, 0.4 or 0.8 g/kg alcohol in a double-blind, independent groups design. Relatively naturalistic changes in facial expression, from neutral (mouth open) to increasing intensities of target emotions, as well as neutral (mouth closed), were simulated using computer-generated dynamic morphs. Accuracy and reaction time were measured, and a two-high-threshold model was applied to hits and false-alarm data to determine sensitivity and response bias. While there was no effect on the principal emotion expressions (happiness, sadness, fear, anger and disgust), compared to those receiving 0.8 g/kg of alcohol and placebo, participants administered 0.4 g/kg alcohol tended to show an enhanced response bias to neutral expressions. Exploration of this effect suggested an accompanying tendency to misattribute neutrality to sad expressions following the 0.4-g/kg dose. The 0.4 g/kg dose of alcohol, but not 0.8 g/kg, produced a limited and specific modification in affect recognition, evidenced by a neutral response bias and possibly an accompanying tendency to misclassify sad expressions as neutral. In light of previous findings on involuntary negative memory following the 0.4-g/kg dose, we suggest that moderate, but not high, doses of alcohol have a special relevance to emotional processing in social drinkers.
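The two-high-threshold analysis of hits and false alarms is not spelled out in the abstract; a minimal sketch, assuming the common Snodgrass and Corwin parameterization (sensitivity Pr = H - FA, bias Br = FA / (1 - Pr)), is given below.

    # Two-high-threshold sensitivity and bias from response counts (assumed
    # parameterization; the abstract does not state which variant was used).
    def two_high_threshold(hits, misses, false_alarms, correct_rejections):
        h = hits / (hits + misses)                                  # hit rate
        fa = false_alarms / (false_alarms + correct_rejections)    # false-alarm rate
        pr = h - fa                                                 # discrimination
        br = fa / (1.0 - pr) if pr < 1.0 else float("nan")          # response bias
        return pr, br

    print(two_high_threshold(hits=40, misses=10, false_alarms=15, correct_rejections=35))
    # -> (0.5, 0.6): above-chance discrimination with a somewhat liberal bias (Br > 0.5)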
Voice emotion recognition by cochlear-implanted children and their normally-hearing peers
Chatterjee, Monita; Zion, Danielle; Deroche, Mickael L.; Burianek, Brooke; Limb, Charles; Goren, Alison; Kulkarni, Aditya M.; Christensen, Julie A.
2014-01-01
Despite their remarkable success in bringing spoken language to hearing impaired listeners, the signal transmitted through cochlear implants (CIs) remains impoverished in spectro-temporal fine structure. As a consequence, pitch-dominant information such as voice emotion, is diminished. For young children, the ability to correctly identify the mood/intent of the speaker (which may not always be visible in their facial expression) is an important aspect of social and linguistic development. Previous work in the field has shown that children with cochlear implants (cCI) have significant deficits in voice emotion recognition relative to their normally hearing peers (cNH). Here, we report on voice emotion recognition by a cohort of 36 school-aged cCI. Additionally, we provide for the first time, a comparison of their performance to that of cNH and NH adults (aNH) listening to CI simulations of the same stimuli. We also provide comparisons to the performance of adult listeners with CIs (aCI), most of whom learned language primarily through normal acoustic hearing. Results indicate that, despite strong variability, on average, cCI perform similarly to their adult counterparts; that both groups’ mean performance is similar to aNHs’ performance with 8-channel noise-vocoded speech; that cNH achieve excellent scores in voice emotion recognition with full-spectrum speech, but on average, show significantly poorer scores than aNH with 8-channel noise-vocoded speech. A strong developmental effect was observed in the cNH with noise-vocoded speech in this task. These results point to the considerable benefit obtained by cochlear-implanted children from their devices, but also underscore the need for further research and development in this important and neglected area. PMID:25448167
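The 8-channel noise-vocoded speech used with the normally-hearing listeners follows the standard channel-vocoder recipe: split the signal into frequency bands, extract each band's temporal envelope, and use the envelopes to modulate band-limited noise. The band edges, filter order and 300-Hz envelope cutoff below are assumptions, and the input is a synthetic stand-in signal rather than recorded speech.

    # Sketch of an 8-channel noise vocoder of the kind used to simulate CI hearing.
    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    def bandpass(sig, lo, hi, fs, order=4):
        return sosfiltfilt(butter(order, [lo, hi], btype="bandpass", fs=fs, output="sos"), sig)

    def lowpass(sig, cutoff, fs, order=4):
        return sosfiltfilt(butter(order, cutoff, btype="lowpass", fs=fs, output="sos"), sig)

    def noise_vocode(speech, fs, n_channels=8, f_lo=100.0, f_hi=7000.0):
        edges = np.geomspace(f_lo, f_hi, n_channels + 1)   # log-spaced band edges
        rng = np.random.default_rng(0)
        out = np.zeros_like(speech)
        for lo, hi in zip(edges[:-1], edges[1:]):
            band = bandpass(speech, lo, hi, fs)
            envelope = lowpass(np.abs(band), 300.0, fs)    # rectify + smooth the band
            carrier = bandpass(rng.standard_normal(len(speech)), lo, hi, fs)
            out += envelope * carrier                      # envelope-modulated noise
        return out / np.max(np.abs(out))

    fs = 16000
    t = np.arange(fs) / fs
    stand_in = np.sin(2 * np.pi * 220 * t) * (1 + 0.5 * np.sin(2 * np.pi * 3 * t))
    vocoded = noise_vocode(stand_in, fs)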
Time-limited effects of emotional arousal on item and source memory.
Wang, Bo; Sun, Bukuan
2015-01-01
Two experiments investigated the time-limited effects of emotional arousal on consolidation of item and source memory. In Experiment 1, participants memorized words (items) and the corresponding speakers (sources) and then took an immediate free recall test. Then they watched a neutral, positive, or negative video 5, 35, or 50 min after learning, and 24 hours later they took surprise memory tests. Experiment 2 was similar to Experiment 1 except that (a) a reality monitoring task was used; (b) elicitation delays of 5, 30, and 45 min were used; and (c) delayed memory tests were given 60 min after learning. Both experiments showed that, regardless of elicitation delay, emotional arousal did not enhance item recall memory. Second, both experiments showed that negative arousal enhanced delayed item recognition memory only at the medium elicitation delay, but not in the shorter or longer delays. Positive arousal enhanced performance only in Experiment 1. Third, regardless of elicitation delay, emotional arousal had little effect on source memory. These findings have implications for theories of emotion and memory, suggesting that emotion effects are contingent upon the nature of the memory task and elicitation delay.
Moeller, Sara K; Lee, Elizabeth A Ewing; Robinson, Michael D
2011-08-01
Dominance and submission constitute fundamentally different social interaction strategies that may be enacted most effectively to the extent that the emotions of others are relatively ignored (dominance) versus noticed (submission). On the basis of such considerations, we hypothesized a systematic relationship between chronic tendencies toward high versus low levels of interpersonal dominance and emotion decoding accuracy in objective tasks. In two studies (total N = 232), interpersonally dominant individuals exhibited poorer levels of emotion recognition in response to audio and video clips (Study 1) and facial expressions of emotion (Study 2). The results provide a novel perspective on interpersonal dominance, suggest its strategic nature (Study 2), and are discussed in relation to Fiske's (1993) social-cognitive theory of power. 2011 APA, all rights reserved
Gillespie, Steven M.; Rotshtein, Pia; Satherley, Rose-Marie; Beech, Anthony R.; Mitchell, Ian J.
2015-01-01
Research with violent offenders has consistently shown impaired recognition of other’s facial expressions of emotion. However, the extent to which similar problems can be observed among sexual offenders remains unknown. Using a computerized task, we presented sexual and violent offenders, and non-offenders, with male and female expressions of anger, disgust, fear, happiness, sadness, and surprise, morphed with neutral expressions at varying levels of intensity (10, 55, and 90% expressive). Based on signal detection theory, we used hit rates and false alarms to calculate the sensitivity index d-prime (d′) and criterion (c) for each emotional expression. Overall, sexual offenders showed reduced sensitivity to emotional expressions across intensity, sex, and type of expression, compared with non-offenders, while both sexual and violent offenders showed particular reduced sensitivity to fearful expressions. We also observed specific effects for high (90%) intensity female faces, with sexual offenders showing reduced sensitivity to anger compared with non-offenders and violent offenders, and reduced sensitivity to disgust compared with non-offenders. Furthermore, both sexual and violent offenders showed impaired sensitivity to high intensity female fearful expressions compared with non-offenders. Violent offenders also showed a higher criterion for classifying moderate and high intensity male expressions as fearful, indicative of a more conservative response style, compared with angry, happy, or sad. These results suggest that both types of offender show problems in emotion recognition, and may have implications for understanding the inhibition of violent and sexually violent behaviors. PMID:26029137
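The sensitivity index d' and criterion c mentioned above come from the equal-variance Gaussian signal detection model: d' = z(H) - z(FA) and c = -(z(H) + z(FA)) / 2, where z is the inverse of the standard normal distribution function. The log-linear correction for extreme rates in the sketch below is one common convention and is assumed here, since the abstract does not state which correction was applied.

    # d-prime and criterion from hit/false-alarm counts (equal-variance Gaussian model).
    from scipy.stats import norm

    def dprime_criterion(hits, misses, false_alarms, correct_rejections):
        # Log-linear correction: add 0.5 to each cell so rates never reach 0 or 1.
        h = (hits + 0.5) / (hits + misses + 1.0)
        fa = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
        z_h, z_fa = norm.ppf(h), norm.ppf(fa)
        return z_h - z_fa, -0.5 * (z_h + z_fa)   # (sensitivity, criterion)

    # Positive criterion values indicate conservative responding; negative values, liberal.
    print(dprime_criterion(hits=18, misses=2, false_alarms=4, correct_rejections=16))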
Emotion and Theory of Mind in Schizophrenia-Investigating the Role of the Cerebellum.
Mothersill, Omar; Knee-Zaska, Charlotte; Donohoe, Gary
2016-06-01
Social cognitive dysfunction, including deficits in facial emotion recognition and theory of mind, is a core feature of schizophrenia and more strongly predicts functional outcome than neurocognition alone. Although traditionally considered to play an important role in motor coordination, the cerebellum has been suggested to play a role in emotion processing and theory of mind, and also shows structural and functional abnormalities in schizophrenia. The aim of this systematic review was to investigate the specific role of the cerebellum in emotion and theory of mind deficits in schizophrenia using previously published functional neuroimaging studies. PubMed and PsycINFO were used to search for all functional neuroimaging studies reporting altered cerebellum activity in schizophrenia patients during emotion processing or theory of mind tasks, published until December 2014. Overall, 14 functional neuroimaging studies were retrieved. Most emotion studies reported lower cerebellum activity in schizophrenia patients relative to healthy controls. In contrast, the theory of mind studies reported mixed findings. Altered activity was observed across several posterior cerebellar regions involved in emotion and cognition. Weaker cerebellum activity in schizophrenia patients relative to healthy controls during emotion processing may contribute to blunted affect and reduced ability to recognise emotion in others. This research could be expanded by examining the relationship between cerebellum function, symptomatology and behaviour, and examining cerebellum functional connectivity in patients during emotion and theory of mind tasks.
MDMA enhances emotional empathy and prosocial behavior
Hysek, Cédric M.; Schmid, Yasmin; Simmler, Linda D.; Domes, Gregor; Heinrichs, Markus; Eisenegger, Christoph; Preller, Katrin H.; Quednow, Boris B.
2014-01-01
3,4-Methylenedioxymethamphetamine (MDMA, ‘ecstasy’) releases serotonin and norepinephrine. MDMA is reported to produce empathogenic and prosocial feelings. It is unknown whether MDMA in fact alters empathic concern and prosocial behavior. We investigated the acute effects of MDMA using the Multifaceted Empathy Test (MET), dynamic Face Emotion Recognition Task (FERT) and Social Value Orientation (SVO) test. We also assessed effects of MDMA on plasma levels of hormones involved in social behavior using a placebo-controlled, double-blind, random-order, cross-over design in 32 healthy volunteers (16 women). MDMA enhanced explicit and implicit emotional empathy in the MET and increased prosocial behavior in the SVO test in men. MDMA did not alter cognitive empathy in the MET but impaired the identification of negative emotions, including fearful, angry and sad faces, in the FERT, particularly in women. MDMA increased plasma levels of cortisol and prolactin, which are markers of serotonergic and noradrenergic activity, and of oxytocin, which has been associated with prosocial behavior. In summary, MDMA sex-specifically altered the recognition of emotions, emotional empathy and prosociality. These effects likely enhance sociability when MDMA is used recreationally and may be useful when MDMA is administered in conjunction with psychotherapy in patients with social dysfunction or post-traumatic stress disorder. PMID:24097374
Facial recognition of happiness among older adults with active and remitted major depression.
Shiroma, Paulo R; Thuras, Paul; Johns, Brian; Lim, Kelvin O
2016-09-30
Biased emotion processing in depression might be a trait characteristic independent of mood improvement and a vulnerability factor for developing further depressive episodes. This phenomenon among older adults with depression has not been adequately examined. In a 2-year cross-sectional study, 59 older adults with active major depression, remitted major depression, or no history of depression completed a facial emotion recognition task (FERT) to probe perceptual bias of happiness. The results showed that depressed patients, compared with never-depressed subjects, had significantly lower sensitivity in identifying happiness, particularly at moderate intensities of the facial stimuli. Patients in remission from a previous major depressive episode but with no or minimal symptoms had similar sensitivity rates for identifying happy facial expressions as patients with an active depressive episode. Further studies would be necessary to confirm whether recognition of happy expressions reflects a persistent perceptual bias of major depression in older adults. Published by Elsevier Ireland Ltd.
Labuschagne, Izelle; Jones, Rebecca; Callaghan, Jenny; Whitehead, Daisy; Dumas, Eve M; Say, Miranda J; Hart, Ellen P; Justo, Damian; Coleman, Allison; Dar Santos, Rachelle C; Frost, Chris; Craufurd, David; Tabrizi, Sarah J; Stout, Julie C
2013-05-15
Facial emotion recognition impairments have been reported in Huntington's disease (HD). However, the nature of the impairments across the spectrum of HD remains unclear. We report on emotion recognition data from 344 participants comprising premanifest HD (PreHD) and early HD patients, and controls. In a test of recognition of facial emotions, we examined responses to six basic emotional expressions and neutral expressions. In addition, and within the early HD sample, we tested for differences in emotion recognition performance between those 'on' vs. 'off' neuroleptic or selective serotonin reuptake inhibitor (SSRI) medications. The PreHD groups showed significantly impaired recognition (p<0.05), compared to controls, of fearful, angry and surprised faces; whereas the early HD groups were significantly impaired across all emotions including neutral expressions. In early HD, neuroleptic use was associated with worse facial emotion recognition, whereas SSRI use was associated with better facial emotion recognition. The findings suggest that emotion recognition impairments exist across the HD spectrum, but are relatively more widespread in manifest HD than in the premanifest period. Commonly prescribed medications to treat HD-related symptoms also appear to affect emotion recognition. These findings have important implications for interpersonal communication and medication usage in HD. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Morin, Alain; Hamper, Breanne
2012-01-01
Inner speech involvement in self-reflection was examined by reviewing 130 studies assessing brain activation during self-referential processing in key self-domains: agency, self-recognition, emotions, personality traits, autobiographical memory, and miscellaneous (e.g., prospection, judgments). The left inferior frontal gyrus (LIFG) has been shown to be reliably recruited during inner speech production. The percentage of studies reporting LIFG activity for each self-dimension was calculated. Fifty five percent of all studies reviewed indicated LIFG (and presumably inner speech) activity during self-reflection tasks; on average LIFG activation is observed 16% of the time during completion of non-self tasks (e.g., attention, perception). The highest LIFG activation rate was observed during retrieval of autobiographical information. The LIFG was significantly more recruited during conceptual tasks (e.g., prospection, traits) than during perceptual tasks (agency and self-recognition). This constitutes additional evidence supporting the idea of a participation of inner speech in self-related thinking. PMID:23049653
The relationships between trait anxiety, place recognition memory, and learning strategy.
Hawley, Wayne R; Grissom, Elin M; Dohanich, Gary P
2011-01-20
Rodents learn to navigate mazes using various strategies that are governed by specific regions of the brain. The type of strategy used when learning to navigate a spatial environment is moderated by a number of factors including emotional states. Heightened anxiety states, induced by exposure to stressors or administration of anxiogenic agents, have been found to bias male rats toward the use of a striatum-based stimulus-response strategy rather than a hippocampus-based place strategy. However, no study has yet examined the relationship between natural anxiety levels, or trait anxiety, and the type of learning strategy used by rats on a dual-solution task. In the current experiment, levels of inherent anxiety were measured in an open field and compared to performance on two separate cognitive tasks, a Y-maze task that assessed place recognition memory, and a visible platform water maze task that assessed learning strategy. Results indicated that place recognition memory on the Y-maze correlated with the use of place learning strategy on the water maze. Furthermore, lower levels of trait anxiety correlated positively with better place recognition memory and with the preferred use of place learning strategy. Therefore, competency in place memory and bias in place strategy are linked to the levels of inherent anxiety in male rats. Copyright © 2010 Elsevier B.V. All rights reserved.
The hierarchical brain network for face recognition.
Zhen, Zonglei; Fang, Huizhen; Liu, Jia
2013-01-01
Numerous functional magnetic resonance imaging (fMRI) studies have identified multiple cortical regions that are involved in face processing in the human brain. However, few studies have characterized the face-processing network as a functioning whole. In this study, we used fMRI to identify face-selective regions in the entire brain and then explore the hierarchical structure of the face-processing network by analyzing functional connectivity among these regions. We identified twenty-five regions mainly in the occipital, temporal and frontal cortex that showed a reliable response selective to faces (versus objects) across participants and across scan sessions. Furthermore, these regions were clustered into three relatively independent sub-networks in a face-recognition task on the basis of the strength of functional connectivity among them. The functionality of the sub-networks likely corresponds to the recognition of individual identity, retrieval of semantic knowledge and representation of emotional information. Interestingly, when the task was switched from face recognition to object recognition, the functional connectivity between the inferior occipital gyrus and the rest of the face-selective regions was significantly reduced, suggesting that this region may serve as an entry node in the face-processing network. In sum, our study provides empirical evidence for cognitive and neural models of face recognition and helps elucidate the neural mechanisms underlying face recognition at the network level.
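The grouping of face-selective regions into sub-networks by the strength of their functional connectivity can be illustrated with a correlate-then-cluster pipeline: correlate the regions' time series, convert correlation to a distance, and cut a hierarchical tree into three clusters. The synthetic time series and the choice of hierarchical clustering below are stand-ins, not the study's actual data or method.

    # Toy functional-connectivity clustering of 25 regions into 3 sub-networks.
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.spatial.distance import squareform

    rng = np.random.default_rng(1)
    n_timepoints, n_regions = 200, 25
    latent = rng.standard_normal((n_timepoints, 3))           # three latent signals
    membership = np.repeat([0, 1, 2], [9, 8, 8])              # planted sub-networks
    ts = latent[:, membership] + 0.8 * rng.standard_normal((n_timepoints, n_regions))

    corr = np.corrcoef(ts, rowvar=False)                      # region-by-region connectivity
    dist = 1.0 - corr                                         # dissimilarity (0 on the diagonal)
    Z = linkage(squareform(dist, checks=False), method="average")
    print(fcluster(Z, t=3, criterion="maxclust"))             # cluster label per region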
Neutral details associated with emotional events are encoded: evidence from a cued recall paradigm.
Mickley Steinmetz, Katherine R; Knight, Aubrey G; Kensinger, Elizabeth A
2016-11-01
Enhanced emotional memory often comes at the cost of memory for surrounding background information. Narrowed-encoding theories suggest that this is due to narrowed attention for emotional information at encoding, leading to impaired encoding of background information. Recent work has suggested that an encoding-based theory may be insufficient. Here, we examined whether cued recall, rather than the recognition memory tasks used previously, would reveal evidence that non-emotional information associated with emotional information was effectively encoded. Participants encoded positive, negative, or neutral objects on neutral backgrounds. At retrieval, they were given either the item or the background as a memory cue and were asked to recall the associated scene element. Counter to narrowed-encoding theories, emotional items were more likely than neutral items to trigger recall of the associated background. This finding suggests that there is a memory trace of this contextual information and that emotional cues may facilitate retrieval of this information.
Effects of stress on emotional memory in patients with Alzheimer's disease and in healthy elderly.
Gómez-Gallego, María; Gómez-García, Juan
2017-12-14
We aimed to examine the relation between stress markers (cortisol levels and state anxiety) and memory for emotional information in AD patients and in healthy elderly. Baseline levels and changes in stress markers during memory testing were assessed in a sample of 98 elderly participants (46 mild-to-moderate Alzheimer's disease patients and 52 controls) recruited from dementia day centers and adult day centers, respectively. Measures included salivary cortisol, state anxiety, and immediate recall and delayed recognition using the International Affective Picture System. Patients' performance in memory tasks was not associated with either cortisol levels or anxiety. In controls, quadratic and linear associations were found between cortisol and immediate recall scores (total and bias, respectively). In addition, quadratic and linear associations were observed between anxiety and delayed recognition scores (total and bias, respectively). The emotional memory of patients with Alzheimer's disease, unlike that of healthy older adults, is not related to stress markers. Future studies that include moderating variables are needed to explain the lack of association.
MacKay, Donald G; Shafto, Meredith; Taylor, Jennifer K; Marian, Diane E; Abrams, Lise; Dyer, Jennifer R
2004-04-01
This article reports five experiments demonstrating theoretically coherent effects of emotion on memory and attention. Experiments 1-3 demonstrated three taboo Stroop effects that occur when people name the color of taboo words. One effect is longer color-naming times for taboo than for neutral words, an effect that diminishes with word repetition. The second effect is superior recall of taboo words in surprise memory tests following color naming. The third effect is better recognition memory for colors consistently associated with taboo words rather than with neutral words. None of these effects was due to retrieval factors, attentional disengagement processes, response inhibition, or strategic attention shifts. Experiments 4 and 5 demonstrated that taboo words impair immediate recall of the preceding and succeeding words in rapidly presented lists but do not impair lexical decision times. We argue that taboo words trigger specific emotional reactions that facilitate the binding of taboo word meaning to salient contextual aspects, such as occurrence in a task and font color in taboo Stroop tasks.
A comparison of basic and social cognition between schizophrenia and schizoaffective disorder.
Fiszdon, Joanna M; Richardson, Randall; Greig, Tamasine; Bell, Morris D
2007-03-01
We compared basic and social cognition in individuals with schizophrenia and schizoaffective disorder. A total of 199 individuals with schizophrenia and 73 with schizoaffective disorder were compared on measures of executive function, verbal and nonverbal memory, and processing speed, as well as two measures of social cognition, the Hinting Task and the Bell Lysaker Emotion Recognition Task. The samples did not differ significantly on the basic cognitive measures; however, individuals with schizoaffective disorder performed significantly better than those with schizophrenia on the Hinting Task, a measure of Theory of Mind. Results provide limited support for a taxonomic distinction between the two disorders.
Narme, Pauline; Mouras, Harold; Roussel, Martine; Devendeville, Agnès; Godefroy, Olivier
2013-01-01
We explored the value of a battery of socioemotional tasks for differentiating between frontotemporal lobar degeneration (FTLD) and Alzheimer's disease (AD). Patients with FTLD (n = 13) or AD (n = 13) and healthy controls (n = 26) underwent a neuropsychological assessment and the socioemotional battery (an empathy questionnaire, an emotion recognition task, and theory of mind tasks). Socioemotional processes were markedly impaired in FTLD but relatively unaffected in mild AD. The computed Socioemotional Index discriminated between FTLD and AD more accurately than behavioral and executive assessments did. Furthermore, impairments in socioemotional processes were correlated with indifference to others.
The Social Cognition Psychometric Evaluation Study: Results of the Expert Survey and RAND Panel
Pinkham, Amy E.; Penn, David L.; Green, Michael F.; Buck, Benjamin; Healey, Kristin; Harvey, Philip D.
2014-01-01
Background: In schizophrenia, social cognition is strongly linked to functional outcome and is increasingly seen as a viable treatment target. The goal of the Social Cognition Psychometric Evaluation (SCOPE) study is to identify and improve the best existing measures of social cognition so they can be suitably applied in large-scale treatment studies. Initial phases of this project sought to (1) develop consensus on critical domains of social cognition and (2) identify the best existing measures of social cognition for use in treatment studies. Methods: Experts in social cognition were invited to nominate key domains of social cognition and the best measures of those domains. Nominations for measures were reduced according to set criteria, and all available psychometric information about these measures was summarized and provided to RAND panelists. Panelists rated the quality of each measure on multiple criteria, and diverging ratings were discussed at the in-person meeting to obtain consensus. Results: Expert surveys identified 4 core domains of social cognition—emotion processing, social perception, theory of mind/mental state attribution, and attributional style/bias. Using RAND panel consensus ratings, the following measures were selected for further evaluation: Ambiguous Intentions Hostility Questionnaire, Bell Lysaker Emotion Recognition Task, Penn Emotion Recognition Test, Relationships Across Domains, Reading the Mind in the Eyes Test, The Awareness of Social Inferences Test, Hinting Task, and Trustworthiness Task. Discussion: While it was possible to establish consensus, only a limited amount of psychometric information is currently available for the candidate measures, which underscores the need for well-validated and standardized measures in this area. PMID:23728248
Saive, Anne-Lise; Royet, Jean-Pierre; Ravel, Nadine; Thévenet, Marc; Garcia, Samuel; Plailly, Jane
2014-01-01
We behaviorally explore the link between olfaction, emotion and memory by testing the hypothesis that the emotion carried by odors facilitates the memory of specific unique events. To investigate this idea, we used a novel behavioral approach inspired by a paradigm developed by our team to study episodic memory in a controlled and as ecological as possible way in humans. The participants freely explored three unique and rich laboratory episodes; each episode consisted of three unfamiliar odors (What) positioned at three specific locations (Where) within a visual context (Which context). During the retrieval test, which occurred 24–72 h after the encoding, odors were used to trigger the retrieval of the complex episodes. The participants were proficient in recognizing the target odors among distractors and retrieving the visuospatial context in which they were encountered. The episodic nature of the task generated high and stable memory performances, which were accompanied by faster responses and slower and deeper breathing. Successful odor recognition and episodic memory were not related to differences in odor investigation at encoding. However, memory performances were influenced by the emotional content of the odors, regardless of odor valence, with both pleasant and unpleasant odors generating higher recognition and episodic retrieval than neutral odors. Finally, the present study also suggested that when the binding between the odors and the spatio-contextual features of the episode was successful, the odor recognition and the episodic retrieval collapsed into a unique memory process that began as soon as the participants smelled the odors. PMID:24936176
Xu, Lei; Ma, Xiaole; Zhao, Weihua; Luo, Lizhu; Yao, Shuxia; Kendrick, Keith M
2015-12-01
There is considerable interest in the potential therapeutic role of the neuropeptide oxytocin in altering attentional bias towards emotional social stimuli in psychiatric disorders. However, it is still unclear whether oxytocin primarily influences attention towards positive or negative valence social stimuli. Here in a double-blind, placebo controlled, between subject design experiment in 60 healthy male subjects we have used the highly sensitive dual-target rapid serial visual presentation (RSVP) paradigm to investigate whether intranasal oxytocin (40IU) treatment alters attentional bias for emotional faces. Results show that oxytocin improved recognition accuracy of neutral and happy expression faces presented in the second target position (T2) during the period of reduced attentional capacity following prior presentation of a first neutral face target (T1), but had no effect on recognition of negative expression faces (angry, fearful, sad). Oxytocin also had no effect on recognition of non-social stimuli (digits) in this task. Recognition accuracy for neutral faces at T2 was negatively associated with autism spectrum quotient (ASQ) scores in the placebo group, and oxytocin's facilitatory effects were restricted to a sub-group of subjects with higher ASQ scores. Our results therefore indicate that oxytocin primarily enhances the allocation of attentional resources towards faces expressing neutral or positive emotion and does not influence that towards negative emotion ones or non-social stimuli. This effect of oxytocin is strongest in healthy individuals with higher autistic trait scores, thereby providing further support for its potential therapeutic use in autism spectrum disorder. Copyright © 2015 Elsevier Ltd. All rights reserved.
Moberly, Aaron C; Patel, Tirth R; Castellanos, Irina
2018-02-01
As a result of their hearing loss, adults with cochlear implants (CIs) would self-report poorer executive functioning (EF) skills than normal-hearing (NH) peers, and these EF skills would be associated with performance on speech recognition tasks. EF refers to a group of high order neurocognitive skills responsible for behavioral and emotional regulation during goal-directed activity, and EF has been found to be poorer in children with CIs than their NH age-matched peers. Moreover, there is increasing evidence that neurocognitive skills, including some EF skills, contribute to the ability to recognize speech through a CI. Thirty postlingually deafened adults with CIs and 42 age-matched NH adults were enrolled. Participants and their spouses or significant others (informants) completed well-validated self-reports or informant-reports of EF, the Behavior Rating Inventory of Executive Function - Adult (BRIEF-A). CI users' speech recognition skills were assessed in quiet using several measures of sentence recognition. NH peers were tested for recognition of noise-vocoded versions of the same speech stimuli. CI users self-reported difficulty on EF tasks of shifting and task monitoring. In CI users, measures of speech recognition correlated with several self-reported EF skills. The present findings provide further evidence that neurocognitive factors, including specific EF skills, may decline in association with hearing loss, and that some of these EF skills contribute to speech processing under degraded listening conditions.
Shared mechanism for emotion processing in adolescents with and without autism
Ioannou, Christina; Zein, Marwa El; Wyart, Valentin; Scheid, Isabelle; Amsellem, Frédérique; Delorme, Richard; Chevallier, Coralie; Grèzes, Julie
2017-01-01
Although, the quest to understand emotional processing in individuals with Autism Spectrum Disorders (ASD) has led to an impressive number of studies, the picture that emerges from this research remains inconsistent. Some studies find that Typically Developing (TD) individuals outperform those with ASD in emotion recognition tasks, others find no such difference. In this paper, we move beyond focusing on potential group differences in behaviour to answer what we believe is a more pressing question: do individuals with ASD use the same mechanisms to process emotional cues? To this end, we rely on model-based analyses of participants’ accuracy during an emotion categorisation task in which displays of anger and fear are paired with direct vs. averted gaze. Behavioural data of 20 ASD and 20 TD adolescents revealed that the ASD group displayed lower overall performance. Yet, gaze direction had a similar impact on emotion categorisation in both groups, i.e. improved accuracy for salient combinations (anger-direct, fear-averted). Critically, computational modelling of participants’ behaviour reveals that the same mechanism, i.e. increased perceptual sensitivity, underlies the contextual impact of gaze in both groups. We discuss the specific experimental conditions that may favour emotion processing and the automatic integration of contextual information in ASD. PMID:28218248
Facial decoding in schizophrenia is underpinned by basic visual processing impairments.
Belge, Jan-Baptist; Maurage, Pierre; Mangelinckx, Camille; Leleux, Dominique; Delatte, Benoît; Constant, Eric
2017-09-01
Schizophrenia is associated with a strong deficit in the decoding of emotional facial expression (EFE). Nevertheless, it is still unclear whether this deficit is specific for emotions or due to a more general impairment for any type of facial processing. This study was designed to clarify this issue. Thirty patients suffering from schizophrenia and 30 matched healthy controls performed several tasks evaluating the recognition of both changeable (i.e. eyes orientation and emotions) and stable (i.e. gender, age) facial characteristics. Accuracy and reaction times were recorded. Schizophrenic patients presented a performance deficit (accuracy and reaction times) in the perception of both changeable and stable aspects of faces, without any specific deficit for emotional decoding. Our results demonstrate a generalized face recognition deficit in schizophrenic patients, probably caused by a perceptual deficit in basic visual processing. It seems that the deficit in the decoding of emotional facial expression (EFE) is not a specific deficit of emotion processing, but is at least partly related to a generalized perceptual deficit in lower-level perceptual processing, occurring before the stage of emotion processing, and underlying more complex cognitive dysfunctions. These findings should encourage future investigations to explore the neurophysiologic background of these generalized perceptual deficits, and stimulate a clinical approach focusing on more basic visual processing. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
fMRI of Parents of Children with Asperger Syndrome: A Pilot Study
ERIC Educational Resources Information Center
Baron-Cohen, Simon; Ring, Howard; Chitnis, Xavier; Wheelwright, Sally; Gregory, Lloyd, Williams, Steve; Brammer, Mick; Bullmore, Ed
2006-01-01
Background: People with autism or Asperger Syndrome (AS) show altered patterns of brain activity during visual search and emotion recognition tasks. Autism and AS are genetic conditions and parents may show the "broader autism phenotype." Aims: (1) To test if parents of children with AS show atypical brain activity during a visual search…
Mineralocorticoid receptor haplotype, estradiol, progesterone and emotional information processing.
Hamstra, Danielle A; de Kloet, E Ronald; Quataert, Ina; Jansen, Myrthe; Van der Does, Willem
2017-02-01
Carriers of MR-haplotype 1 and 3 (GA/CG; rs5522 and rs2070951) are more sensitive to the influence of oral contraceptives (OC) and menstrual cycle phase on emotional information processing than MR-haplotype 2 (CA) carriers. We investigated whether this effect is associated with estradiol (E2) and/or progesterone (P4) levels. Healthy MR-genotyped premenopausal women were tested twice in a counterbalanced design. Naturally cycling (NC) women were tested in the early-follicular and mid-luteal phase and OC-users during OC-intake and in the pill-free week. At both sessions E2 and P4 were assessed in saliva. Tests included implicit and explicit positive and negative affect, attentional blink accuracy, emotional memory, emotion recognition, and risky decision-making (gambling). MR-haplotype 2 homozygotes had higher implicit happiness scores than MR-haplotype 2 heterozygotes (p=0.031) and MR-haplotype 1/3 carriers (p<0.001). MR-haplotype 2 homozygotes also had longer reaction times to happy faces in an emotion recognition test than MR-haplotype 1/3 (p=0.001). Practice effects were observed for most measures. The pattern of correlations between information processing and P4 or E2 differed between sessions, as well as the moderating effects of the MR genotype. In the first session the MR-genotype moderated the influence of P4 on implicit anxiety (sr=-0.30; p=0.005): higher P4 was associated with reduction in implicit anxiety, but only in MR-haplotype 2 homozygotes (sr=-0.61; p=0.012). In the second session the MR-genotype moderated the influence of E2 on the recognition of facial expressions of happiness (sr=-0.21; p=0.035): only in MR-haplotype 1/3 higher E2 was correlated with happiness recognition (sr=0.29; p=0.005). In the second session higher E2 and P4 were negatively correlated with accuracy in lag2 trials of the attentional blink task (p<0.001). Thus NC women, compared to OC-users, performed worse on lag 2 trials (p=0.041). The higher implicit happiness scores of MR-haplotype 2 homozygotes are in line with previous reports. Performance in the attentional blink task may be influenced by OC-use. The MR-genotype moderates the influence of E2 and P4 on emotional information processing. This moderating effect may depend on the novelty of the situation. Copyright © 2016 Elsevier Ltd. All rights reserved.
No Influence of Positive Emotion on Orbitofrontal Reality Filtering: Relevance for Confabulation
Liverani, Maria Chiara; Manuel, Aurélie L.; Guggisberg, Adrian G.; Nahum, Louis; Schnider, Armin
2016-01-01
Orbitofrontal reality filtering (ORFi) is a mechanism that allows us to keep thought and behavior in phase with reality. Its failure induces reality confusion with confabulation and disorientation. Confabulations have been claimed to have a positive emotional bias, suggesting that they emanate from a tendency to embellish the situation of a handicap. Here we tested the influence of positive emotion on ORFi in healthy subjects using a paradigm validated in reality confusing patients and with a known electrophysiological signature, a frontal positivity at 200–300 ms after memory evocation. Subjects made two continuous recognition tasks (“two runs”), composed of the same set of neutral and positive pictures, but arranged in different order. In both runs, participants had to indicate picture repetitions within, and only within, the ongoing run. The first run measures learning and recognition. The second run, where all items are familiar, requires ORFi to avoid false positive responses. High-density evoked potentials were recorded from 19 healthy subjects during completion of the task. Performance was more accurate and faster on neutral than positive pictures in both runs and for all conditions. Evoked potential correlates of emotion and reality filtering occurred at 260–350 ms but dissociated in terms of amplitude and topography. In both runs, positive stimuli evoked a more negative frontal potential than neutral ones. In the second run, the frontal positivity characteristic of reality filtering was separately, and to the same degree, expressed for positive and neutral stimuli. We conclude that ORFi, the ability to place oneself correctly in time and space, is not influenced by emotional positivity of the processed material. PMID:27303276
Exploring Cultural Differences in the Recognition of the Self-Conscious Emotions.
Chung, Joanne M; Robins, Richard W
2015-01-01
Recent research suggests that the self-conscious emotions of embarrassment, shame, and pride have distinct, nonverbal expressions that can be recognized in the United States at above-chance levels. However, few studies have examined the recognition of these emotions in other cultures, and little research has been conducted in Asia. Consequently the cross-cultural generalizability of self-conscious emotions has not been firmly established. Additionally, there is no research that examines cultural variability in the recognition of the self-conscious emotions. Cultural values and exposure to Western culture have been identified as contributors to variability in recognition rates for the basic emotions; we sought to examine this for the self-conscious emotions using the University of California, Davis Set of Emotion Expressions (UCDSEE). The present research examined recognition of the self-conscious emotion expressions in South Korean college students and found that recognition rates were very high for pride, low but above chance for shame, and near zero for embarrassment. To examine what might be underlying the recognition rates we found in South Korea, recognition of self-conscious emotions and several cultural values were examined in a U.S. college student sample of European Americans, Asian Americans, and Asian-born individuals. Emotion recognition rates were generally similar between the European Americans and Asian Americans, and higher than emotion recognition rates for Asian-born individuals. These differences were not explained by cultural values in an interpretable manner, suggesting that exposure to Western culture is a more important mediator than values.
Exploring Cultural Differences in the Recognition of the Self-Conscious Emotions
Chung, Joanne M.; Robins, Richard W.
2015-01-01
Recent research suggests that the self-conscious emotions of embarrassment, shame, and pride have distinct, nonverbal expressions that can be recognized in the United States at above-chance levels. However, few studies have examined the recognition of these emotions in other cultures, and little research has been conducted in Asia. Consequently the cross-cultural generalizability of self-conscious emotions has not been firmly established. Additionally, there is no research that examines cultural variability in the recognition of the self-conscious emotions. Cultural values and exposure to Western culture have been identified as contributors to variability in recognition rates for the basic emotions; we sought to examine this for the self-conscious emotions using the University of California, Davis Set of Emotion Expressions (UCDSEE). The present research examined recognition of the self-conscious emotion expressions in South Korean college students and found that recognition rates were very high for pride, low but above chance for shame, and near zero for embarrassment. To examine what might be underlying the recognition rates we found in South Korea, recognition of self-conscious emotions and several cultural values were examined in a U.S. college student sample of European Americans, Asian Americans, and Asian-born individuals. Emotion recognition rates were generally similar between the European Americans and Asian Americans, and higher than emotion recognition rates for Asian-born individuals. These differences were not explained by cultural values in an interpretable manner, suggesting that exposure to Western culture is a more important mediator than values. PMID:26309215
Recognition of facial emotions in neuropsychiatric disorders.
Kohler, Christian G; Turner, Travis H; Gur, Raquel E; Gur, Ruben C
2004-04-01
Recognition of facial emotions represents an important aspect of interpersonal communication and is governed by select neural substrates. We present data on emotion recognition in healthy young adults utilizing a novel set of color photographs of evoked universal emotions. In addition, we review the recent literature on emotion recognition in psychiatric and neurologic disorders, and studies that compare different disorders.
Rosenberg, Hannah; Dethier, Marie; Kessels, Roy P C; Westbrook, R Frederick; McDonald, Skye
2015-07-01
Traumatic brain injury (TBI) impairs emotion perception. Perception of negative emotions (sadness, disgust, fear, and anger) is reportedly affected more than positive (happiness and surprise) ones. It has been argued that this reflects a specialized neural network underpinning negative emotions that is vulnerable to brain injury. However, studies typically do not equate for differential difficulty between emotions. We aimed to examine whether emotion recognition deficits in people with TBI were specific to negative emotions, while equating task difficulty, and to determine whether perception deficits might be accounted for by other cognitive processes. Twenty-seven people with TBI and 28 matched control participants identified 6 basic emotions at 2 levels of intensity (a) the conventional 100% intensity and (b) "equated intensity"-that is, an intensity that yielded comparable accuracy rates across emotions in controls. (a) At 100% intensity, the TBI group was impaired in recognizing anger, fear, and disgust but not happiness, surprise, or sadness and performed worse on negative than positive emotions. (b) At equated intensity, the TBI group was poorer than controls overall but not differentially poorer in recognizing negative emotions. Although processing speed and nonverbal reasoning were associated with emotion accuracy, injury severity by itself was a unique predictor. When task difficulty is taken into account, individuals with TBI show impairment in recognizing all facial emotions. There was no evidence for a specific impairment for negative emotions or any particular emotion. Impairment was accounted for by injury severity rather than being a secondary effect of reduced neuropsychological functioning. (c) 2015 APA, all rights reserved).
de Souza, Leonardo Cruz; Bertoux, Maxime; de Faria, Ângelo Ribeiro Vaz; Corgosinho, Laiane Tábata Souza; Prado, Ana Carolina de Almeida; Barbosa, Izabela Guimarães; Caramelli, Paulo; Colosimo, Enrico; Teixeira, Antônio Lúcio
2018-05-25
ABSTRACTBackground:Social cognition tasks, such as identification of emotions, can contribute to the diagnosis of neuropsychiatric disorders. The wide use of Facial Emotion Recognition Test (FERT) is hampered by the absence of normative dataset and by the limited understanding of how demographic factors such as age, education, gender, and cultural background may influence the performance on the test. We analyzed the influence of these variables in the performance in the FERT from the short version of the Social and Emotional Assessment. This task is composed by 35 pictures with 7 different emotions presented 5 times each. Cognitively healthy Brazilian participants (n = 203; 109 females and 94 males) underwent the FERT. We compared the performance of participants across gender, age, and educational subgroups. We also compared the performance of Brazilians with a group of French subjects (n = 60) matched for gender, age, and educational level. There was no gender difference regarding the performance on total score and in each emotion subscore in the Brazilian sample. We found a significant effect of aging and schooling on the performance on the FERT, with younger and more educated subjects having higher scores. Brazilian and French participants did not differ in the FERT and its subscores. Normative data for employing the FERT in Brazilian population is presented. Data here provided may contribute to the interpretation of the results of FERT in different cultural contexts and highlight the common bias that should be corrected in the future tasks to be developed.
Emotional Cues during Simultaneous Face and Voice Processing: Electrophysiological Insights
Liu, Taosheng; Pinheiro, Ana; Zhao, Zhongxin; Nestor, Paul G.; McCarley, Robert W.; Niznikiewicz, Margaret A.
2012-01-01
Both facial expression and tone of voice represent key signals of emotional communication but their brain processing correlates remain unclear. Accordingly, we constructed a novel implicit emotion recognition task consisting of simultaneously presented human faces and voices with neutral, happy, and angry valence, within the context of recognizing monkey faces and voices task. To investigate the temporal unfolding of the processing of affective information from human face-voice pairings, we recorded event-related potentials (ERPs) to these audiovisual test stimuli in 18 normal healthy subjects; N100, P200, N250, P300 components were observed at electrodes in the frontal-central region, while P100, N170, P270 were observed at electrodes in the parietal-occipital region. Results indicated a significant audiovisual stimulus effect on the amplitudes and latencies of components in frontal-central (P200, P300, and N250) but not the parietal occipital region (P100, N170 and P270). Specifically, P200 and P300 amplitudes were more positive for emotional relative to neutral audiovisual stimuli, irrespective of valence, whereas N250 amplitude was more negative for neutral relative to emotional stimuli. No differentiation was observed between angry and happy conditions. The results suggest that the general effect of emotion on audiovisual processing can emerge as early as 200 msec (P200 peak latency) post stimulus onset, in spite of implicit affective processing task demands, and that such effect is mainly distributed in the frontal-central region. PMID:22383987
Abdeltawwab, Mohamed M; Khater, Ahmed; El-Anwar, Mohammad W
2016-01-01
The combination of acoustic and electric stimulation as a way to enhance speech recognition performance in cochlear implant (CI) users has generated considerable interest in the recent years. The purpose of this study was to evaluate the bimodal advantage of the FS4 speech processing strategy in combination with hearing aids (HA) as a means to improve low-frequency resolution in CI patients. Nineteen postlingual CI adults were selected to participate in this study. All patients wore implants on one side and HA on the contralateral side with residual hearing. Monosyllabic word recognition, speech in noise, and emotion and talker identification were assessed using CI with fine structure processing/FS4 and high-definition continuous interleaved sampling strategies, HA alone, and a combination of CI and HA. The bimodal stimulation showed improvement in speech performance and emotion identification for the question/statement/order tasks, which was statistically significant compared to patients with CI alone, but there were no significant statistical differences in intragender talker discrimination and emotion identification for the happy/angry/neutral tasks. The poorest performance was obtained with HA only, and it was statistically significant compared to the other modalities. The bimodal stimulation showed enhanced speech performance in CI patients, and it improves the limitations provided by electric or acoustic stimulation alone. © 2016 S. Karger AG, Basel.
Facial Emotion Recognition: A Survey and Real-World User Experiences in Mixed Reality
Mehta, Dhwani; Siddiqui, Mohammad Faridul Haque
2018-01-01
Extensive possibilities of applications have made emotion recognition ineluctable and challenging in the field of computer science. The use of non-verbal cues such as gestures, body movement, and facial expressions convey the feeling and the feedback to the user. This discipline of Human–Computer Interaction places reliance on the algorithmic robustness and the sensitivity of the sensor to ameliorate the recognition. Sensors play a significant role in accurate detection by providing a very high-quality input, hence increasing the efficiency and the reliability of the system. Automatic recognition of human emotions would help in teaching social intelligence in the machines. This paper presents a brief study of the various approaches and the techniques of emotion recognition. The survey covers a succinct review of the databases that are considered as data sets for algorithms detecting the emotions by facial expressions. Later, mixed reality device Microsoft HoloLens (MHL) is introduced for observing emotion recognition in Augmented Reality (AR). A brief introduction of its sensors, their application in emotion recognition and some preliminary results of emotion recognition using MHL are presented. The paper then concludes by comparing results of emotion recognition by the MHL and a regular webcam. PMID:29389845
Facial Emotion Recognition: A Survey and Real-World User Experiences in Mixed Reality.
Mehta, Dhwani; Siddiqui, Mohammad Faridul Haque; Javaid, Ahmad Y
2018-02-01
Extensive possibilities of applications have made emotion recognition ineluctable and challenging in the field of computer science. The use of non-verbal cues such as gestures, body movement, and facial expressions convey the feeling and the feedback to the user. This discipline of Human-Computer Interaction places reliance on the algorithmic robustness and the sensitivity of the sensor to ameliorate the recognition. Sensors play a significant role in accurate detection by providing a very high-quality input, hence increasing the efficiency and the reliability of the system. Automatic recognition of human emotions would help in teaching social intelligence in the machines. This paper presents a brief study of the various approaches and the techniques of emotion recognition. The survey covers a succinct review of the databases that are considered as data sets for algorithms detecting the emotions by facial expressions. Later, mixed reality device Microsoft HoloLens (MHL) is introduced for observing emotion recognition in Augmented Reality (AR). A brief introduction of its sensors, their application in emotion recognition and some preliminary results of emotion recognition using MHL are presented. The paper then concludes by comparing results of emotion recognition by the MHL and a regular webcam.
Contextual Social Cognition Impairments in Schizophrenia and Bipolar Disorder
Villarin, Lilian; Theil, Donna; Gonzalez-Gadea, María Luz; Gomez, Pedro; Mosquera, Marcela; Huepe, David; Strejilevich, Sergio; Vigliecca, Nora Silvana; Matthäus, Franziska; Decety, Jean; Manes, Facundo; Ibañez, Agustín M.
2013-01-01
Background The ability to integrate contextual information with social cues to generate social meaning is a key aspect of social cognition. It is widely accepted that patients with schizophrenia and bipolar disorders have deficits in social cognition; however, previous studies on these disorders did not use tasks that replicate everyday situations. Methodology/Principal Findings This study evaluates the performance of patients with schizophrenia and bipolar disorders on social cognition tasks (emotional processing, empathy, and social norms knowledge) that incorporate different levels of contextual dependence and involvement of real-life scenarios. Furthermore, we explored the association between social cognition measures, clinical symptoms and executive functions. Using a logistic regression analysis, we explored whether the involvement of more basic skills in emotional processing predicted performance on empathy tasks. The results showed that both patient groups exhibited deficits in social cognition tasks with greater context sensitivity and involvement of real-life scenarios. These deficits were more severe in schizophrenic than in bipolar patients. Patients did not differ from controls in tasks involving explicit knowledge. Moreover, schizophrenic patients’ depression levels were negatively correlated with performance on empathy tasks. Conclusions/Significance Overall performance on emotion recognition predicted performance on intentionality attribution during the more ambiguous situations of the empathy task. These results suggest that social cognition deficits could be related to a general impairment in the capacity to implicitly integrate contextual cues. Important implications for the assessment and treatment of individuals with schizophrenia and bipolar disorders, as well as for neurocognitive models of these pathologies are discussed. PMID:23520477
[Impact of facial emotional recognition alterations in Dementia of the Alzheimer type].
Rubinstein, Wanda; Cossini, Florencia; Politis, Daniel
2016-07-01
Face recognition of basic emotions is independent of other deficits in dementia of the Alzheimer type. Among these deficits, there is disagreement about what emotions are more difficult to recognize. Our aim was to study the presence of alterations in the process of facial recognition of basic emotions, and to investigate if there were differences in the recognition of each type of emotion in Alzheimer's disease. With three tests of recognition of basic facial emotions we evaluated 29 patients who had been diagnosed with dementia of the Alzheimer type and 18 control subjects. Significant differences were obtained in tests of recognition of basic facial emotions and between each. Since the amygdala, one of the brain structures responsible for emotional reaction, is affected in the early stages of this disease, our findings become relevant to understand how this alteration of the process of emotional recognition impacts the difficulties these patients have in both interpersonal relations and behavioral disorders.
Murphy, Nora A; Isaacowitz, Derek M
2008-06-01
The authors conducted a meta-analysis to determine the magnitude of older and younger adults' preferences for emotional stimuli in studies of attention and memory. Analyses involved 1,085 older adults from 37 independent samples and 3,150 younger adults from 86 independent samples. Both age groups exhibited small to medium emotion salience effects (i.e., preference for emotionally valenced stimuli over neutral stimuli) as well as positivity preferences (i.e., preference for positively valenced stimuli over neutral stimuli) and negativity preferences (i.e., preference for negatively valenced stimuli to neutral stimuli). There were few age differences overall. Type of measurement appeared to influence the magnitude of effects; recognition studies indicated significant age effects, where older adults showed smaller effects for emotion salience and negativity preferences than younger adults.
Mapping correspondence between facial mimicry and emotion recognition in healthy subjects.
Ponari, Marta; Conson, Massimiliano; D'Amico, Nunzia Pina; Grossi, Dario; Trojano, Luigi
2012-12-01
We aimed at verifying the hypothesis that facial mimicry is causally and selectively involved in emotion recognition. For this purpose, in Experiment 1, we explored the effect of tonic contraction of muscles in upper or lower half of participants' face on their ability to recognize emotional facial expressions. We found that the "lower" manipulation specifically impaired recognition of happiness and disgust, the "upper" manipulation impaired recognition of anger, while both manipulations affected recognition of fear; recognition of surprise and sadness were not affected by either blocking manipulations. In Experiment 2, we verified whether emotion recognition is hampered by stimuli in which an upper or lower half-face showing an emotional expression is combined with a neutral half-face. We found that the neutral lower half-face interfered with recognition of happiness and disgust, whereas the neutral upper half impaired recognition of anger; recognition of fear and sadness was impaired by both manipulations, whereas recognition of surprise was not affected by either manipulation. Taken together, the present findings support simulation models of emotion recognition and provide insight into the role of mimicry in comprehension of others' emotional facial expressions. PsycINFO Database Record (c) 2012 APA, all rights reserved.
Empathy and empathy induced prosocial behavior in 6- and 7-year-olds with autism spectrum disorder.
Deschamps, Peter K H; Been, Marieke; Matthys, Walter
2014-07-01
The present study aimed to assess empathy and prosocial behavior in 6-7 year old children with autism spectrum disorders (ASDs). Results showed, first, lower levels of parent- and teacher-rated cognitive empathy, and similar levels of affective empathy in children with ASD compared to typically developing (TD) children. Second, emotion recognition for basic emotions, one aspect of cognitive empathy, in a story task was adequate in ASD children, but ASD children with severe impairments in social responsiveness had difficulties in recognizing fear. Third, prosocial behavior in response to signals of distress of a peer in a computer task was similar in ASD as in TD children. In conclusion, early elementary school children with ASD show specific impairments in cognitive empathy.
Acute fluoxetine modulates emotional processing in young adult volunteers.
Capitão, L P; Murphy, S E; Browning, M; Cowen, P J; Harmer, C J
2015-08-01
Fluoxetine is generally regarded as the first-line pharmacological treatment for young people, as it is believed to show a more favourable benefit:risk ratio than other antidepressants. However, the mechanisms through which fluoxetine influences symptoms in youth have been little investigated. This study examined whether acute administration of fluoxetine in a sample of young healthy adults altered the processing of affective information, including positive, sad and anger cues. A total of 35 male and female volunteers aged between 18 and 21 years old were randomized to receive a single 20 mg dose of fluoxetine or placebo. At 6 h after administration, participants completed a facial expression recognition task, an emotion-potentiated startle task, an attentional dot-probe task and the Rapid Serial Visual Presentation. Subjective ratings of mood, anxiety and side effects were also taken pre- and post-fluoxetine/placebo administration. Relative to placebo-treated participants, participants receiving fluoxetine were less accurate at identifying anger and sadness and did not show the emotion-potentiated startle effect. There were no overall significant effects of fluoxetine on subjective ratings of mood. Fluoxetine can modulate emotional processing after a single dose in young adults. This pattern of effects suggests a potential cognitive mechanism for the greater benefit:risk ratio of fluoxetine in adolescent patients.
Age-related differences in emotion recognition ability: a cross-sectional study.
Mill, Aire; Allik, Jüri; Realo, Anu; Valk, Raivo
2009-10-01
Experimental studies indicate that recognition of emotions, particularly negative emotions, decreases with age. However, there is no consensus at which age the decrease in emotion recognition begins, how selective this is to negative emotions, and whether this applies to both facial and vocal expression. In the current cross-sectional study, 607 participants ranging in age from 18 to 84 years (mean age = 32.6 +/- 14.9 years) were asked to recognize emotions expressed either facially or vocally. In general, older participants were found to be less accurate at recognizing emotions, with the most distinctive age difference pertaining to a certain group of negative emotions. Both modalities revealed an age-related decline in the recognition of sadness and -- to a lesser degree -- anger, starting at about 30 years of age. Although age-related differences in the recognition of expression of emotion were not mediated by personality traits, 2 of the Big 5 traits, openness and conscientiousness, made an independent contribution to emotion-recognition performance. Implications of age-related differences in facial and vocal emotion expression and early onset of the selective decrease in emotion recognition are discussed in terms of previous findings and relevant theoretical models.
Yang, Lixia; Truong, Linda; Fuss, Samantha; Bislimovic, Sanja
2012-01-01
The self-reference effect (SRE) is a powerful memory advantage associated with encoding in reference to the self (e.g., Rogers, Kuiper, & Kirker, 1977). To explore whether this mnemonic benefit occurs spontaneously, the current study assessed how ageing and divided attention affect the magnitude of the SRE in emotional memory (i.e., memory for emotional stimuli). The sample included a young Full Attention group (young-FA), a young Divided Attention group (young-DA), and an older adult group. The division of attention was manipulated at encoding where participants incidentally studied positive, negative, and neutral trait adjectives in either a self-reference (i.e., rating how well each word describes themselves) or an other-reference condition (i.e., rating how well each word describes another person). Memory for these words was assessed with both recall and recognition tasks. The results from both tasks demonstrated equivalent SRE for all three groups across emotional valence categories of stimuli, suggesting that the SRE is a spontaneous, effortless, and robust effect in memory.