In human communication there is often a close relationship between the perception of an emotionally expressive face and the facial response of the viewer. Whereas perception and generation of facial expressions have been studied separately with functional imaging methods, no studies exist on their interaction. We combined the presentation of emotionally expressive faces with the instruction to react with predetermined, assigned facial movements. fMRI was used in an event-related design to examine healthy subjects while they viewed happy, sad, or neutral faces and were instructed to simultaneously move the corners of their mouths (a) upwards, (b) downwards, or (c) to refrain from movement. The subjects' facial movements were recorded with an MR-compatible video camera. Movement latencies were shortened in congruent situations (e.g. the presentation of a happy face combined with upward movements) and delayed in non-congruent situations. Dissonant stimuli, more than congruent ones, activated the inferior prefrontal cortex and the somatomotor cortex bilaterally. The congruent condition, in particular when seeing a happy face, activated the medial basotemporal lobes (hippocampus, amygdala, parahippocampal region). We hypothesize that this region facilitates congruent facial movements when an emotionally expressive face is perceived and that it is part of a system for non-volitional emotional facial movements.

Source
http://dx.doi.org/10.1016/s0925-4927(03)00006-4
