We appraise other people's emotions by combining multiple sources of information, including somatic facial/body reactions and the surrounding context. A wealth of literature has revealed how people take contextual information into account when interpreting facial expressions, but the mechanisms mediating this influence remain to be fully investigated. Across two experiments, we mapped the neural representations of two distinct (but comparably unpleasant) negative states, pain and disgust, as conveyed by naturalistic facial expressions or contextual sentences. Negative expressions led to shared activity in the fusiform gyrus and superior temporal sulcus. In contrast, pain contexts recruited the supramarginal, postcentral, and insular cortices, whereas disgust contexts engaged the temporoparietal cortex and hippocampus/amygdala. When pairing the two sources of information, we found a higher likelihood of classifying an expression according to the sentence preceding it. Furthermore, networks specifically involved in processing contexts were re-enacted whenever a face followed said context. Finally, the perigenual medial prefrontal cortex (mPFC) showed increased activity for consistent (vs. inconsistent) face-context pairings, suggesting that it integrates state-specific information from the two sources. Overall, our study reveals the heterogeneous nature of face-context integration, which operates according to both state-general and state-specific principles, the latter mediated by the perigenual mPFC.
DOI: http://dx.doi.org/10.1523/JNEUROSCI.2233-23.2024