Projects

Stimuli Development: The ‘Amsterdam Dynamic Facial Expression Set – Bath Intensity Variations’

We developed videos of faces expressing a wide range of emotions. For this project, we built on the work of the Amsterdam Interdisciplinary Centre for Emotion and adapted their videos (the Amsterdam Dynamic Facial Expression Set; ADFES) to represent different expression intensities (Bath Intensity Variations; BIV). The ADFES-BIV includes 12 models (7 male, 5 female) displaying the expressions anger, disgust, fear, sadness, surprise, happiness, pride, contempt, embarrassment, and neutral. Because the stimuli are videos, the ADFES-BIV captures the dynamic nature of facial expressions. As facial emotional expressions vary in their intensity, the set further includes three expression intensity levels (low, intermediate, and high). There are also 10 videos of an additional model, which can be used for practice trials, bringing the total to 370 videos (12 models × 10 expression categories × 3 intensity levels = 360, plus 10 practice videos), each 1 second long. More information on the development and validation of the videos can be found in the open-access publication here. The ADFES-BIV is available free of charge for research purposes. Please contact us via email if you are interested in using the ADFES-BIV in your research.
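
As a rough illustration of the set’s structure, the counts above can be reproduced by enumerating models, expression categories, and intensity levels. The file names in this sketch are purely hypothetical and do not reflect the actual ADFES-BIV naming scheme.

```python
from itertools import product

# Hypothetical labels; the actual ADFES-BIV file naming may differ.
models = [f"M{i:02d}" for i in range(1, 13)]          # 12 models (7 male, 5 female)
emotions = ["anger", "disgust", "fear", "sadness", "surprise",
            "happiness", "pride", "contempt", "embarrassment", "neutral"]
intensities = ["low", "intermediate", "high"]

main_set = [f"{m}_{e}_{i}.mp4" for m, e, i in product(models, emotions, intensities)]
practice = [f"practice_{n:02d}.mp4" for n in range(1, 11)]  # 10 practice videos from an extra model

print(len(main_set))                  # 360
print(len(main_set) + len(practice))  # 370, matching the set's total video count
```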

Facial Emotion Recognition Research

Using the ADFES-BIV, we found that female participants outperformed male participants in both accuracy and speed of facial emotion recognition, and this advantage was evident across all three intensity levels. The female advantage may have biological roots, for example in women’s child-rearing role, or it may stem from the way women are socialised, which could encourage facial emotion recognition. To read the full paper, which is publicly accessible, click here.

We also compared facial emotion recognition in adolescents and young adults with a diagnosis of an autism spectrum condition to that of controls; controls outperformed the autism group across all three expression intensity levels. When inspecting patterns of confusion between emotion categories, we found that the confusions at high expression intensity were the same in both groups but amplified in the autism group. At low expression intensity, however, the autism group perceived emotional expressions as neutral much more often than controls did. These results suggest that two distinct processes underlie the observed facial emotion recognition deficit, one related to specificity and one to sensitivity. The article was published here.
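
To make the idea of confusion patterns concrete, the sketch below builds a confusion matrix from made-up trial responses; it is only an illustration, not the analysis pipeline used in the published article. Off-diagonal cells count misclassifications, such as low-intensity expressions being labelled as neutral.

```python
import pandas as pd

# Made-up trial data for illustration: which expression was shown vs. which label was chosen.
categories = ["anger", "disgust", "fear", "sadness", "surprise",
              "happiness", "pride", "contempt", "embarrassment", "neutral"]

trials = pd.DataFrame({
    "shown":    ["fear", "fear", "sadness", "anger", "disgust"],
    "response": ["surprise", "fear", "neutral", "anger", "anger"],
})

# Rows: expression shown; columns: response given.
# The diagonal holds correct responses; off-diagonal cells are confusions.
confusion = pd.crosstab(trials["shown"], trials["response"]).reindex(
    index=categories, columns=categories, fill_value=0
)
print(confusion)
```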

Facial Expression Research

We showed that the way facial muscles are engaged can affect people’s ability to interpret others’ facial emotional expressions. Specifically, when there is anatomical incongruence between one’s own facial expression and the observed expression, people are less accurate at identifying the other person’s facial emotional expression. The article is freely accessible here.

When examining participants’ facial emotional expressions, we take a more comprehensive approach and consider the holistic interplay of movements across the face that make up a facial expression. In doing so, we were able to show that people mimic the facial emotional expressions they observe, and do so across the entire face. We observed a subtle mimicry response for a wide range of facial emotional expressions. To read the full paper, which is publicly accessible, click here.

More information on how facial muscles relate to emotion, including theoretical as well as methodological considerations, can be found here.