Nonmotor symptoms in Parkinson's disease (PD) involving cognition and emotionality have received increasing attention. The objective of the present study was to investigate recognition of emotional prosody in patients with PD (n = 14) in comparison to healthy control subjects (HC, n = 14). Event-related brain potentials (ERPs) were recorded in a modified oddball paradigm under passive listening and active target detection instructions. Results showed poorer performance by PD patients in classifying emotional prosody. ERPs generated by emotional deviants (happy/sad) during passive listening revealed diminished amplitudes of the mismatch-related negativity for sad deviants, indicating an impairment of early preattentive processing of emotional prosody in PD.
DOI: http://dx.doi.org/10.1002/mds.21038
Cureus
December 2024
Department of Neurosurgery, University of Tsukuba Hospital, Tsukuba, JPN.
Dysprosody affects rhythm and intonation in speech, impairing the expression of emotion or attitude, and usually presents as a negative symptom with a monotonous tone. We herein report a rare case of recurrent glioblastoma (GBM) with dysprosody featuring sing-song speech. A 68-year-old man, formerly left-handed, with right temporal GBM underwent gross total resection.
J Commun Disord
January 2025
School of Foreign Studies, China University of Petroleum (East China), Qingdao, China. Electronic address:
Introduction: It is still under debate whether and how semantic content modulates emotional prosody perception in children with autism spectrum disorder (ASD). The current study investigated this issue in two experiments by systematically manipulating the semantic information of Chinese disyllabic words.
Method: The present study explored the potential modulation of semantic content complexity on emotional prosody perception in Mandarin-speaking children with ASD.
Emotion
January 2025
Department of Psychology, Cognitive and Affective Neuroscience Unit, University of Zurich.
Affective voice signaling has significant biological and social relevance across various species, and different affective signaling types have emerged through the evolution of voice communication. These types range from basic affective voice bursts and nonverbal affective vocalizations up to affective intonations superimposed on speech utterances in humans, in the form of paraverbal prosodic patterns. These different types of affective signaling should have evolved to be acoustically and perceptually distinctive, allowing accurate and nuanced affective communication.
J Child Lang
January 2025
ELTE-HUN-REN NAP Comparative Ethology research group, Research Centre for Natural Sciences, Institute of Cognitive Neuroscience and Psychology, Budapest, Hungary.
By comparing infant-directed speech to spouse- and dog-directed talk, we aimed to investigate how pitch and utterance length are modulated by speakers according to the speech context and the partner's expected needs and capabilities. We found that mean pitch was modulated in line with the partner's attentional needs, while pitch range and utterance length were modulated according to the partner's expected linguistic competence. In a nursery-rhyme situation, speakers used the highest pitch and widest pitch range with all partners, suggesting that the infant-directed context strongly influences these acoustic features.
Brain Sci
November 2024
Department of Speech-Language-Hearing Sciences, University of Minnesota, Minneapolis, MN 55455, USA.
Background/objectives: Emotional prosody, the intonation and rhythm of speech that convey emotion, is vital for speech communication because it provides essential context and nuance to the words being spoken. This study explored how listeners automatically process emotional prosody in speech, focusing on differential neural responses to prosodic categories and potential sex differences.
Methods: The pilot data involved 11 male and 11 female adult participants (age range: 18-28 years).