Reading sadness beyond human faces.

Brain Res

CNRS USR 3246, CHU Pitié-Salpêtrière, Paris, France.

Published: August 2010

Human faces are the primary displays of emotion. Since emotional stimuli elicit larger ERP components than neutral ones at the perceptual level, one may ask whether this emotional facilitation is biased toward human faces. To address this question, we measured the P1 and N170 components of the ERPs elicited by human facial stimuli compared with artificial stimuli, namely non-humanoid robots. Fifteen healthy young adults were shown sad and neutral, upright and inverted expressions on human versus robotic displays. An increase in P1 amplitude in response to sad displays compared with neutral ones indicated an early perceptual amplification of sadness information. P1 and N170 latencies were delayed in response to robotic stimuli relative to human ones, while N170 amplitude was unaffected by the medium. Inverted human stimuli elicited a longer P1 latency and a larger N170 amplitude, whereas inverted robotic stimuli did not. Overall, our results show that emotion facilitation is not restricted to human faces but rather extends to non-human displays, suggesting a capacity to read emotion beyond faces.

DOI: http://dx.doi.org/10.1016/j.brainres.2010.05.051

