Recently, research on the lateralization of linguistic and nonlinguistic (emotional) prosody has experienced a revival. However, neither neuroimaging nor patient evidence draws a coherent picture substantiating right-hemispheric lateralization of prosody, and of emotional prosody in particular. The current overview summarizes positions and data on the lateralization of emotion and emotional prosodic processing in the brain and proposes that: (1) the realization of emotional prosodic processing in the brain is based on differentially lateralized subprocesses, and (2) methodological factors can influence the lateralization of emotional prosody in neuroimaging investigations. The latter evidence reveals that emotional valence effects are strongly right-lateralized in studies using compact blocked presentation of emotional stimuli. In contrast, data obtained from event-related studies are indicative of bilateral or left-accented lateralization of emotional prosodic valence. These findings suggest a strong interaction between language and emotional prosodic processing.
DOI: http://dx.doi.org/10.1016/S0079-6123(06)56015-7
Cureus
December 2024
Department of Neurosurgery, University of Tsukuba Hospital, Tsukuba, JPN.
Dysprosody affects rhythm and intonation in speech, impairing the expression of emotion or attitude, and usually presents as a negative symptom resulting in a monotonous tone. We herein report a rare case of recurrent glioblastoma (GBM) with dysprosody featuring sing-song speech. A 68-year-old man, formerly left-handed, with right temporal GBM underwent gross total resection.
J Commun Disord
January 2025
School of Foreign Studies, China University of Petroleum (East China), Qingdao, China.
Introduction: It is still under debate whether and how semantic content modulates emotional prosody perception in children with autism spectrum disorder (ASD). The current study investigated this issue in two experiments by systematically manipulating semantic information in Chinese disyllabic words.
Method: The present study explored the potential modulation of semantic content complexity on emotional prosody perception in Mandarin-speaking children with ASD.
Emotion
January 2025
Department of Psychology, Cognitive and Affective Neuroscience Unit, University of Zurich.
Affective voice signaling has significant biological and social relevance across various species, and different affective signaling types have emerged through the evolution of voice communication. These types range from basic affective voice bursts and nonverbal affective vocalizations up to affective intonations superimposed on speech utterances in humans, in the form of paraverbal prosodic patterns. These different types of affective signaling should have evolved to be acoustically and perceptually distinctive, allowing accurate and nuanced affective communication.
J Child Lang
January 2025
ELTE-HUN-REN NAP Comparative Ethology research group, Research Centre for Natural Sciences, Institute of Cognitive Neuroscience and Psychology, Budapest, Hungary.
By comparing infant-directed speech to spouse- and dog-directed talk, we aimed to investigate how pitch and utterance length are modulated by speakers considering the speech context and the partner's expected needs and capabilities. We found that mean pitch was modulated in line with the partner's attentional needs, while pitch range and utterance length were modulated according to the partner's expected linguistic competence. In a situation with a nursery rhyme, speakers used the highest pitch and widest pitch range with all partners, suggesting that the infant-directed context greatly influences these acoustic features.
Brain Sci
November 2024
Department of Speech-Language-Hearing Sciences, University of Minnesota, Minneapolis, MN 55455, USA.
Background/objectives: Emotional prosody, the intonation and rhythm of speech that conveys emotions, is vital for speech communication as it provides essential context and nuance to the words being spoken. This study explored how listeners automatically process emotional prosody in speech, focusing on different neural responses for the prosodic categories and potential sex differences.
Methods: The pilot data here involved 11 male and 11 female adult participants (age range: 18-28 years).