Sadness is unique: neural processing of emotions in speech prosody in musicians and non-musicians.

Front Hum Neurosci

Institute of Medical Psychology, Ludwig-Maximilians-Universität Munich, Germany; Human Science Center, Ludwig-Maximilians-Universität Munich, Germany; Parmenides Center for Art and Science, Pullach, Germany; Department of Psychology and Key Laboratory of Machine Perception (MoE), Peking University, Beijing, China.

Published: February 2015

Musical training has been shown to have positive effects on several aspects of speech processing; however, its effects on the neural processing of speech prosody conveying distinct emotions remain poorly understood. We used functional magnetic resonance imaging (fMRI) to investigate whether neural responses to speech prosody conveying happiness, sadness, and fear differ between musicians and non-musicians. Differences in the processing of emotional speech prosody between the two groups were observed only when sadness was expressed. Musicians showed increased activation in the middle frontal gyrus, the anterior medial prefrontal cortex, the posterior cingulate cortex, and the retrosplenial cortex. Our results suggest an increased sensitivity of emotional processing in musicians with respect to sadness expressed in speech, possibly reflecting empathic processes.

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4311618
DOI: http://dx.doi.org/10.3389/fnhum.2014.01049

Publication Analysis

Top Keywords

speech prosody (16); neural processing (8); musicians non-musicians (8); musical training (8); prosody conveying (8); sadness expressed (8); speech (6); processing (5); sadness (4); sadness unique (4)

Similar Publications

Humans rarely speak without producing co-speech gestures of the hands, head, and other parts of the body. Co-speech gestures are also highly restricted in how they are timed with speech, typically synchronizing with prosodically prominent syllables. What functional principles underlie this relationship? Here, we examine how the production of co-speech manual gestures influences spatiotemporal patterns of the oral articulators during speech production.

Comprehension of acoustically degraded emotional prosody in Alzheimer's disease and primary progressive aphasia.

Sci Rep

December 2024

Dementia Research Centre, Department of Neurodegenerative Disease, UCL Queen Square Institute of Neurology, University College London, 1st Floor, 8-11 Queen Square, London, WC1N 3AR, UK.

Previous research suggests that emotional prosody perception is impaired in neurodegenerative diseases such as Alzheimer's disease (AD) and primary progressive aphasia (PPA). However, no previous research has investigated emotional prosody perception in these diseases under non-ideal listening conditions. We recruited 18 patients with AD and 31 with PPA (nine logopenic (lvPPA), 11 nonfluent/agrammatic (nfvPPA), and 11 semantic (svPPA)), together with 24 healthy age-matched individuals.

Beat gestures and prosodic prominence interactively influence language comprehension.

Cognition

December 2024

Max Planck Institute for Psycholinguistics, Wundtlaan 1, 6525 XD Nijmegen, The Netherlands; Radboud University Nijmegen, Donders Institute for Brain, Cognition and Behaviour, 6525 EN Nijmegen, The Netherlands.

Face-to-face communication is not only about 'what' is said but also 'how' it is said, both in speech and bodily signals. Beat gestures are rhythmic hand movements that typically accompany prosodic prominence in conversation. Yet, it is still unclear how beat gestures influence language comprehension.

Background: Changes in voice are a symptom of Parkinson's disease and are used to assess the progression of the condition. However, natural differences in people's voices can make this challenging. Computerized binary speech classification can identify people with PD (PwPD), but its multiclass application to detecting the severity of the disease remains difficult.

Concurrent processing of the prosodic hierarchy is supported by cortical entrainment and phase-amplitude coupling.

Cereb Cortex

December 2024

Institute for the Interdisciplinary Study of Language Evolution, University of Zurich, Affolternstrasse 56, 8050 Zürich, Switzerland.

Models of phonology posit a hierarchy of prosodic units that is relatively independent from syntactic structure, requiring its own parsing. It remains unexplored how this prosodic hierarchy is represented in the brain. We investigated this foundational question by means of an electroencephalography (EEG) study.
