Recently, research on the lateralization of linguistic and nonlinguistic (emotional) prosody has experienced a revival. However, neither neuroimaging nor patient evidence draws a coherent picture substantiating right-hemispheric lateralization of prosody, and of emotional prosody in particular. The current overview summarizes positions and data on the lateralization of emotion and emotional prosodic processing in the brain and proposes that: (1) the realization of emotional prosodic processing in the brain is based on differentially lateralized subprocesses and (2) methodological factors can influence the lateralization of emotional prosody in neuroimaging investigations. The latter evidence reveals that emotional valence effects are strongly right lateralized in studies using compact blocked presentation of emotional stimuli. In contrast, data obtained from event-related studies are indicative of bilateral or left-accented lateralization of emotional prosodic valence. These findings suggest a strong interaction between language and emotional prosodic processing.

Source: http://dx.doi.org/10.1016/S0079-6123(06)56015-7

Publication Analysis

Top Keywords

emotional prosody (16)
emotional prosodic (16)
lateralization emotional (12)
prosodic processing (12)
emotional (9)
processing brain (8)
lateralization (6)
prosody (5)
prosody brain (4)
brain overview (4)

Similar Publications

Dysprosody affects rhythm and intonation in speech, resulting in the impairment of emotional or attitude expression, and usually presents as a negative symptom resulting in a monotonous tone. We herein report a rare case of recurrent glioblastoma (GBM) with dysprosody featuring sing-song speech. A 68-year-old man, formerly left-handed, with right temporal GBM underwent gross total resection.

Introduction: It is still under debate whether and how semantic content modulates the perception of emotional prosody in children with autism spectrum disorder (ASD). The current study aimed to investigate this issue in two experiments by systematically manipulating semantic information in Chinese disyllabic words.

Method: The present study explored the potential modulation of semantic content complexity on emotional prosody perception in Mandarin-speaking children with ASD.

Affective voice signaling has significant biological and social relevance across various species, and different affective signaling types have emerged through the evolution of voice communication. These types range from basic affective voice bursts and nonverbal affective vocalizations up to affective intonations superimposed on speech utterances in humans in the form of paraverbal prosodic patterns. These different types of affective signaling should have evolved to be acoustically and perceptually distinctive, allowing accurate and nuanced affective communication.

By comparing infant-directed speech with spouse- and dog-directed talk, we aimed to investigate how pitch and utterance length are modulated by speakers according to the speech context and the partner's expected needs and capabilities. We found that mean pitch was modulated in line with the partner's attentional needs, while pitch range and utterance length were modulated according to the partner's expected linguistic competence. In a situation with a nursery rhyme, speakers used the highest pitch and widest pitch range with all partners, suggesting that the infant-directed context greatly influences these acoustic features.

Sex Differences in Processing Emotional Speech Prosody: Preliminary Findings from a Multi-Feature Oddball Study.

Brain Sci

November 2024

Department of Speech-Language-Hearing Sciences, University of Minnesota, Minneapolis, MN 55455, USA.

Background/objectives: Emotional prosody, the intonation and rhythm of speech that conveys emotions, is vital for speech communication as it provides essential context and nuance to the words being spoken. This study explored how listeners automatically process emotional prosody in speech, focusing on different neural responses for the prosodic categories and potential sex differences.

Methods: The pilot data here involved 11 male and 11 female adult participants (age range: 18-28).
