The spatial distribution of eye movements predicts the (false) recognition of emotional facial expressions.

PLoS One

Laboratoire Développement, Individu, Processus, Handicap, Éducation (DIPHE), Département Psychologie du Développement, de l'Éducation et des Vulnérabilités (PsyDEV), Institut de Psychologie, Université de Lyon (Lumière Lyon 2), Lyon, France.

Published: June 2021

Recognizing facial expressions of emotion is a fundamental ability for adapting to the social environment. To date, it remains unclear whether the spatial distribution of eye movements predicts accurate recognition of facial emotions or, on the contrary, their confusion. In the present study, we asked participants to recognize facial emotions while their gaze behavior was monitored with eye-tracking technology. In Experiment 1a, 40 participants (20 women) performed a classic facial emotion recognition task with a 5-choice procedure (anger, disgust, fear, happiness, sadness). In Experiment 1b, a second group of 40 participants (20 women) was exposed to the same materials and procedure, except that they were instructed to indicate (Yes/No response) whether the face expressed a specific emotion (e.g., anger), with the five emotion categories tested in separate blocks. In Experiment 2, two groups of 32 participants performed the same task as in Experiment 1a while exposed to partial facial expressions composed of action units (AUs) present in or absent from certain parts of the face (top, middle, or bottom). The coding of the AUs produced by the models showed complex facial configurations for most emotional expressions, with several AUs in common. Eye-tracking data indicated that relevant facial actions were actively gazed at by the decoders during both accurate recognition and errors. False recognition was mainly associated with additional visual exploration of less relevant facial actions in regions containing ambiguous AUs or AUs relevant to other emotional expressions. Finally, the recognition of facial emotions from partial expressions showed that no single facial action was necessary to effectively communicate an emotional state. Instead, the recognition of facial emotions relied on the integration of a complex set of facial cues.
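As a rough illustration of what the "spatial distribution of eye movements" can mean in practice, the sketch below summarizes gaze behavior as dwell-time proportions over face regions (areas of interest, AOIs) and relates those proportions to recognition outcomes. This is not the authors' analysis pipeline; the AOI names, data, and logistic model are assumptions made purely for illustration.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical fixation records: one row per fixation, with the face region
# (AOI) it landed in and its duration in milliseconds.
fixations = pd.DataFrame({
    "trial":    [1, 1, 1, 2, 2, 2, 3, 3],
    "aoi":      ["eyes", "mouth", "eyes", "nose", "mouth", "mouth", "eyes", "nose"],
    "duration": [220, 180, 300, 250, 210, 190, 270, 150],
})

# Per-trial dwell-time proportion in each AOI: one simple way to quantify
# the spatial distribution of eye movements over the face.
dwell = fixations.pivot_table(index="trial", columns="aoi",
                              values="duration", aggfunc="sum", fill_value=0)
dwell = dwell.div(dwell.sum(axis=1), axis=0)

# Hypothetical outcomes: 1 = emotion correctly recognized, 0 = confusion.
recognized = np.array([1, 0, 1])

# A simple logistic model relating the gaze distribution to recognition.
model = LogisticRegression().fit(dwell.values, recognized)
print(dict(zip(dwell.columns, model.coef_[0])))
```

In the study itself, gaze is related to the action units actually present in each region of the expression rather than to coarse regions alone, so this sketch only conveys the general logic of linking where people look to whether they recognize the emotion.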

Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7837501 (PMC)
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0245777 (PLOS)

