Preschoolers' real-time coordination of vocal and facial emotional information.

J Exp Child Psychol

Department of Psychology, University of Calgary, Calgary, Alberta T2N 1N4, Canada.

Published: February 2016

An eye-tracking methodology was used to examine the time course of 3- and 5-year-olds' ability to link speech bearing different acoustic cues to emotion (i.e., happy-sounding, neutral, and sad-sounding intonation) to photographs of faces reflecting different emotional expressions. Analyses of saccadic eye movement patterns indicated that, for both 3- and 5-year-olds, sad-sounding speech triggered gaze shifts to a matching (sad-looking) face from the earliest moments of speech processing. However, it was not until approximately 800 ms into a happy-sounding utterance that preschoolers began to use the emotional cues from speech to identify a matching (happy-looking) face. Complementary analyses based on conscious/controlled behaviors (children's explicit points toward the faces) indicated that 5-year-olds, but not 3-year-olds, could successfully match happy-sounding and sad-sounding vocal affect to a corresponding emotional face. Together, the findings clarify developmental patterns in preschoolers' implicit versus explicit ability to coordinate emotional cues across modalities and highlight preschoolers' greater sensitivity to sad-sounding speech as the auditory signal unfolds in time.

DOI: http://dx.doi.org/10.1016/j.jecp.2015.09.014

