Publications by authors named "Tanja S H Wingenbach"

9 Publications


Perception of Discrete Emotions in Others: Evidence for Distinct Facial Mimicry Patterns.

Sci Rep 2020 Mar 13;10(1):4692. Epub 2020 Mar 13.

Centre for Applied Autism Research, Department of Psychology, University of Bath, Bath, UK.

Covert facial mimicry involves subtle facial muscle activation in observers when they perceive the facial emotional expressions of others. It remains uncertain whether prototypical facial features in emotional expressions are being covertly mimicked and also whether covert facial mimicry involves distinct facial muscle activation patterns across muscles per emotion category, or simply distinguishes positive versus negative valence in observed facial emotions. To test whether covert facial mimicry is emotion-specific, we measured facial electromyography (EMG) from five muscle sites (corrugator supercilii, levator labii, frontalis lateralis, depressor anguli oris, zygomaticus major) whilst participants watched videos of people expressing 9 different basic and complex emotions and a neutral expression. This study builds upon previous research by including a greater number of facial muscle measures and emotional expressions. It is the first study to investigate activation patterns across muscles during facial mimicry and to provide evidence for distinct patterns of facial muscle activation when viewing individual emotion categories, suggesting that facial mimicry is emotion-specific, rather than just valence-based.
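The abstract reports distinct cross-muscle activation patterns per emotion category. As a loose, hypothetical illustration of that idea (not the paper's analysis pipeline), the sketch below averages synthetic baseline-corrected EMG change scores per muscle site and checks whether the cross-muscle patterns separate emotion categories above chance using a leave-one-out nearest-centroid test; all names and data are invented.

```python
# Hypothetical sketch (not the authors' pipeline): test whether cross-muscle
# EMG patterns differ by emotion category, using synthetic data and a simple
# leave-one-out nearest-centroid check. Requires only numpy.
import numpy as np

rng = np.random.default_rng(0)
muscles = ["corrugator", "levator", "frontalis", "depressor", "zygomaticus"]
emotions = ["anger", "disgust", "fear", "sadness", "happiness"]

# Synthetic baseline-corrected EMG change scores: 30 trials per emotion,
# one column per muscle site, with a distinct mean pattern per emotion.
patterns = rng.normal(0, 1, size=(len(emotions), len(muscles)))
X = np.vstack([p + rng.normal(0, 1.5, size=(30, len(muscles))) for p in patterns])
y = np.repeat(np.arange(len(emotions)), 30)

# Leave-one-out nearest-centroid classification: if emotion labels can be
# recovered from the cross-muscle pattern above chance (1/5 here), the
# patterns are emotion-specific rather than merely valence-based.
correct = 0
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    centroids = np.array([X[mask][y[mask] == k].mean(axis=0)
                          for k in range(len(emotions))])
    pred = np.argmin(np.linalg.norm(centroids - X[i], axis=1))
    correct += int(pred == y[i])

print(f"leave-one-out accuracy: {correct / len(y):.2f} "
      f"(chance = {1 / len(emotions):.2f})")
```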
http://dx.doi.org/10.1038/s41598-020-61563-5
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7069962
March 2020

Evaluations of affective stimuli modulated by another person's presence and affiliative touch.

Emotion 2021 Mar;21(2):360-375. Epub 2019 Nov 14.

Social and Cognitive Neuroscience Laboratory.

Affiliative touch carries affective meaning and affects the receiver. Although research demonstrates that receiving touch modulates the neural processing of emotions, its effects on evaluations of affective stimuli remain unexplored. The current research examined the effects of affiliative touch on the evaluation of affective images across three studies and aimed to disentangle the effect of another person's mere presence from the addition of affiliative touch. Participants therefore underwent social manipulations (another person present, alone) and touch manipulations (receiving, self-providing, providing to the experimenter) while viewing affective images (negative, neutral, and positive valence) and evaluating their valence. Study 1 included hand-squeezing (N = 39) and Study 2 forearm-stroking (N = 40), both in within-subjects designs; Study 3 included hand-squeezing (N = 109) in a between-subjects design. Across the studies, the results suggested that the receiving condition decreased the negativity of negative images, whereas the providing condition reduced the positivity of positive images. Furthermore, the other-person-present condition increased the positivity of positive images compared with the alone condition in Study 1 and with the receiving condition in Study 2. Hand-squeezing and forearm-stroking had differential effects on affective image evaluations depending on the image valence and on who provided the touch. Overall, receiving touch appears to attenuate negative evaluations in negative contexts, while the presence of others amplifies positive evaluations in positive contexts. The discussion highlights the importance of affiliative touch within social interactions.
http://dx.doi.org/10.1037/emo0000700
March 2021

Facial mimicry, facial emotion recognition and alexithymia in post-traumatic stress disorder.

Behav Res Ther 2019 Nov;122:103436. Epub 2019 Jul 4.

University Hospital Zurich, Department of Consultation-Liaison Psychiatry and Psychosomatic Medicine, University of Zurich, Switzerland.

Individuals with post-traumatic stress disorder (PTSD) show abnormalities in higher-order emotional processes, including emotion regulation and recognition. However, automatic facial responses to observed facial emotion (facial mimicry) have not yet been investigated in PTSD. Furthermore, whereas deficits in facial emotion recognition have been reported, little is known about contributing factors. We therefore investigated facial mimicry and the potential effects of alexithymia and expressive suppression on facial emotion recognition in PTSD. Thirty-eight participants with PTSD, 43 traumatized healthy controls, and 33 non-traumatized healthy controls completed questionnaires assessing alexithymia and expressive suppression. Facial electromyography was measured from the zygomaticus major and corrugator supercilii muscles during a facial emotion recognition task. Corrugator activity was greater than zygomaticus activity in response to negative emotional expressions, and vice versa for positive emotions, but no significant group differences emerged. Individuals with PTSD reported greater expressive suppression and alexithymia than controls, but only levels of alexithymia predicted lower recognition of negative facial expressions. While automatic facial responses to observed facial emotion appear to be intact in PTSD, alexithymia, but not expressive suppression, plays an important role in the recognition of negative facial emotions. If these findings are replicated, future research should evaluate whether successful interventions for alexithymia improve facial emotion recognition abilities.
http://dx.doi.org/10.1016/j.brat.2019.103436
November 2019

Development and Validation of Verbal Emotion Vignettes in Portuguese, English, and German.

Front Psychol 2019;10:1135. Epub 2019 Jun 14.

Social and Cognitive Neuroscience Laboratory, Centre for Biological and Health Sciences, Mackenzie Presbyterian University, São Paulo, Brazil.

Everyday human social interaction involves sharing experiences verbally, and these experiences often include emotional content. Providing this context generally leads to the experience of emotions in the conversation partner. However, most emotion elicitation stimulus sets are based on images or film sequences providing visual and/or auditory emotion cues. To approximate what occurs within social interactions, the current study aimed at creating and validating verbal emotion vignettes as a stimulus set to elicit emotions (anger, disgust, fear, sadness, happiness, gratitude, guilt, and neutral). Participants had to mentally immerse themselves in 40 vignettes and state which emotion they experienced, along with the intensity of this emotion. The vignettes were validated on a large sample of native Portuguese speakers (N = 229), as well as on native English-speaking (N = 59) and native German-speaking (N = 50) samples, to maximise the applicability of the vignettes. Hierarchical cluster analyses showed that the vignettes mapped clearly onto their target emotion categories in all three languages. The final stimulus sets each include 4 vignettes per emotion category, plus 1 additional vignette per emotion category that can be used for task familiarisation within research. The high agreement rates on the experienced emotion, combined with the medium to large intensity ratings in all three languages, suggest that the stimulus sets are suitable for application in emotion research (e.g., emotion recognition or emotion elicitation).
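The validation relied on hierarchical cluster analyses of how raters labelled each vignette. The following is a minimal, hypothetical sketch of that kind of check (not the published analysis or data): vignettes are represented by the proportion of raters endorsing each emotion label, clustered with Ward's method, and the resulting clusters are compared against the intended emotion categories.

```python
# Hypothetical sketch (not the published analysis): hierarchical clustering of
# vignettes based on how often each emotion label was endorsed, to check that
# vignettes group by their intended emotion. Uses synthetic endorsement data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
emotions = ["anger", "disgust", "fear", "sadness",
            "happiness", "gratitude", "guilt", "neutral"]
n_per_emotion = 5  # e.g., 4 experimental + 1 familiarisation vignette per category

# Rows = vignettes, columns = proportion of raters endorsing each emotion label.
rows, intended = [], []
for k, emo in enumerate(emotions):
    for _ in range(n_per_emotion):
        p = np.full(len(emotions), 0.02)
        p[k] = 0.8                          # most raters pick the target emotion
        p += rng.uniform(0, 0.05, len(emotions))
        rows.append(p / p.sum())
        intended.append(emo)
X = np.array(rows)

# Ward's method on Euclidean distances; cut the tree into 8 clusters and see
# whether clusters coincide with the intended emotion categories.
Z = linkage(X, method="ward")
labels = fcluster(Z, t=len(emotions), criterion="maxclust")
for emo in emotions:
    members = {labels[i] for i, e in enumerate(intended) if e == emo}
    print(f"{emo:>10}: cluster(s) {sorted(members)}")
```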
http://dx.doi.org/10.3389/fpsyg.2019.01135
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6587102
June 2019

Incongruence Between Observers' and Observed Facial Muscle Activation Reduces Recognition of Emotional Facial Expressions From Video Stimuli.

Front Psychol 2018;9:864. Epub 2018 Jun 6.

Centre for Applied Autism Research, Department of Psychology, University of Bath, Bath, United Kingdom.

According to embodied cognition accounts, viewing others' facial emotion can elicit the corresponding emotion representation in observers, which entails simulations of sensory, motor, and contextual experiences. In line with this, published research has found that viewing others' facial emotion elicits automatic matched facial muscle activation, which in turn facilitates emotion recognition. Making congruent facial muscle activity explicit might therefore produce an even greater recognition advantage, whereas conflicting sensory information, i.e., incongruent facial muscle activity, might impede recognition. The effects of actively manipulating facial muscle activity on facial emotion recognition from videos were investigated across three experimental conditions: (a) explicit imitation of viewed facial emotional expressions (stimulus-congruent condition), (b) pen-holding with the lips (stimulus-incongruent condition), and (c) passive viewing (control condition). It was hypothesised that (1) conditions (a) and (b) would result in greater facial muscle activity than (c), (2) condition (a) would increase emotion recognition accuracy from others' faces compared to (c), and (3) condition (b) would lower recognition accuracy for expressions with a salient facial feature in the lower, but not the upper, face area compared to (c). Participants (42 males, 42 females) underwent a facial emotion recognition experiment (ADFES-BIV) while electromyography (EMG) was recorded from five facial muscle sites. The order of the experimental conditions was counterbalanced. Pen-holding caused stimulus-incongruent facial muscle activity for expressions with facial feature saliency in the lower face region, which reduced recognition of lower face region emotions. Explicit imitation caused stimulus-congruent facial muscle activity without modulating recognition. Methodological implications are discussed.
http://dx.doi.org/10.3389/fpsyg.2018.00864
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5997820
June 2018

Impaired Recognition of Positive Emotions in Individuals with Posttraumatic Stress Disorder, Cumulative Traumatic Exposure, and Dissociation.

Psychother Psychosom 2018;87(2):118-120. Epub 2018 Mar 1.

Department of Psychiatry and Psychotherapy, University Hospital Zurich, University of Zurich, Zurich, Switzerland.

http://dx.doi.org/10.1159/000486342
October 2018

Sex differences in facial emotion recognition across varying expression intensity levels from videos.

PLoS One 2018;13(1):e0190634. Epub 2018 Jan 2.

Department of Psychology, University of Bath, Bath, Somerset, United Kingdom.

There has been much research on sex differences in the ability to recognise facial expressions of emotions, with results generally showing a female advantage in reading emotional expressions from the face. However, most of the research to date has used static images and/or 'extreme' examples of facial expressions. Therefore, little is known about how expression intensity and dynamic stimuli might affect the commonly reported female advantage in facial emotion recognition. The current study investigated sex differences in accuracy of response (Hu; unbiased hit rates) and response latencies for emotion recognition using short video stimuli (1 s) of 10 different facial emotion expressions (anger, disgust, fear, sadness, surprise, happiness, contempt, pride, embarrassment, neutral) across three variations in the intensity of the emotional expression (low, intermediate, high) in an adolescent and adult sample (N = 111; 51 male, 60 female) aged between 16 and 45 years (M = 22.2, SD = 5.7). Overall, females showed more accurate facial emotion recognition compared to males and were faster in correctly recognising facial emotions. The female advantage in reading expressions from the faces of others was unaffected by expression intensity levels and emotion categories used in the study. The effects were specific to recognition of emotions, as males and females did not differ in the recognition of neutral faces. Together, the results showed a robust sex difference favouring females in facial emotion recognition using video stimuli of a wide range of emotions and expression intensity variations.
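Both video-based studies in this list report unbiased hit rates (Hu; Wagner, 1993) alongside raw accuracy. Below is a minimal sketch of how Hu can be computed from a stimulus-by-response confusion matrix; the matrix values are toy numbers, not data from the study.

```python
# Minimal sketch of Wagner's (1993) unbiased hit rate (Hu), computed from a
# stimulus-by-response confusion matrix; the matrix below is toy data only.
import numpy as np

emotions = ["anger", "fear", "happiness"]
# Rows: presented emotion; columns: emotion label chosen by the participant.
confusion = np.array([
    [18,  4,  2],   # anger stimuli
    [ 6, 14,  4],   # fear stimuli
    [ 1,  2, 21],   # happiness stimuli
])

hits = np.diag(confusion).astype(float)
stimuli_per_category = confusion.sum(axis=1)    # how often each emotion was shown
responses_per_category = confusion.sum(axis=0)  # how often each label was chosen

# Hu_i = hits_i^2 / (stimuli_i * responses_i): raw accuracy weighted by how
# specifically the response label was used, penalising over-used labels.
hu = hits ** 2 / (stimuli_per_category * responses_per_category)
raw = hits / stimuli_per_category
for emo, r, h in zip(emotions, raw, hu):
    print(f"{emo:>9}: raw = {r:.2f}, Hu = {h:.2f}")
```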
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0190634
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5749848
February 2018

Validation of the Amsterdam Dynamic Facial Expression Set--Bath Intensity Variations (ADFES-BIV): A Set of Videos Expressing Low, Intermediate, and High Intensity Emotions.

PLoS One 2016;11(1):e0147112. Epub 2016 Jan 19.

Department of Psychology, University of Bath, Bath, United Kingdom.

Most of the existing sets of facial expressions of emotion contain static photographs. While increasing demand for stimuli with enhanced ecological validity in facial emotion recognition research has led to the development of video stimuli, these typically involve full-blown (apex) expressions. However, variations in the intensity of emotional facial expressions occur in real-life social interactions, with low intensity expressions occurring frequently. The current study therefore developed and validated a set of video stimuli portraying three levels of intensity of emotional expressions, from low to high intensity. The videos were adapted from the Amsterdam Dynamic Facial Expression Set (ADFES) and termed the Bath Intensity Variations (ADFES-BIV). A healthy sample of 92 people recruited from the University of Bath community (41 male, 51 female) completed a facial emotion recognition task including expressions of 6 basic emotions (anger, happiness, disgust, fear, surprise, sadness) and 3 complex emotions (contempt, embarrassment, pride), each expressed at the three intensities, plus neutral expressions. Accuracy scores (raw and unbiased (Hu) hit rates) were calculated, as well as response times. Accuracy rates above the chance level of responding were found for all emotion categories, producing an overall raw hit rate of 69% for the ADFES-BIV. The three intensity levels were validated as distinct categories, with higher accuracies and faster responses to high intensity expressions than to intermediate intensity expressions, which in turn had higher accuracies and faster responses than low intensity expressions. To further validate the intensity levels, a second study with standardised display times was conducted, replicating this pattern. The ADFES-BIV has greater ecological validity than many other emotion stimulus sets and allows for versatile applications in emotion research. It can be retrieved free of charge for research purposes from the corresponding author.
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0147112
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4718603
July 2016