Sensorimotor models suggest that understanding the emotional content of a face recruits a simulation process in which the viewer partially reproduces the facial expression in their own sensorimotor system. An important prediction of these models is that disrupting simulation should make emotion recognition more difficult. Here we used electroencephalography (EEG) and facial electromyography (EMG) to investigate how interfering with sensorimotor signals from the face influences the real-time processing of emotional faces. EEG and EMG were recorded while healthy adults viewed emotional faces and rated their valence. During control blocks, participants held a conjoined pair of chopsticks loosely between their lips. During interference blocks, participants held the chopsticks horizontally between their teeth and lips to generate motor noise on the lower part of the face; this noise was confirmed by EMG over the zygomaticus. Analysis of the EEG indicated that faces expressing happiness or disgust (lower-face expressions) elicited a larger-amplitude N400 when presented during interference blocks than during control blocks, suggesting that interference increased semantic retrieval demands. The selective impact of facial motor interference on the brain response to lower-face expressions supports sensorimotor models of emotion understanding.
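As a rough illustration of the ERP contrast described above, the sketch below uses MNE-Python to epoch the EEG around face onsets and compare mean amplitude in an assumed 300-500 ms N400 window between interference and control blocks. The file name, event codes, channel picks, and time window are all illustrative assumptions, not the authors' pipeline.

# Hypothetical sketch of the N400 comparison; all names and parameters
# below are assumptions for illustration, not the published analysis.
import mne

raw = mne.io.read_raw_fif("sub-01_task-faces_raw.fif", preload=True)  # hypothetical file
raw.filter(0.1, 30.0)  # a typical ERP band-pass

# Assumed trigger codes for lower-face expressions in each block type
event_id = {"lower/control": 1, "lower/interference": 2}
events = mne.find_events(raw)  # assumes a stim channel in the recording

epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.2, tmax=0.8, baseline=(None, 0), preload=True)

# Average per condition, then take mean amplitude over assumed
# centro-parietal sites in a 300-500 ms N400 window
evokeds = {cond: epochs[cond].average() for cond in event_id}
picks = mne.pick_channels(evokeds["lower/control"].ch_names, ["Cz", "CPz", "Pz"])

def n400_mean(evoked, tmin=0.3, tmax=0.5):
    """Mean amplitude (volts) in the assumed N400 window."""
    return evoked.copy().crop(tmin, tmax).data[picks].mean()

for cond, evoked in evokeds.items():
    print(f"{cond}: {n400_mean(evoked) * 1e6:.2f} uV")

On the abstract's account, interference should yield a more negative N400 for lower-face expressions; the actual study would test this statistically across participants rather than printing single-subject means as done here.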

Source: http://dx.doi.org/10.3758/s13415-017-0503-2
