Background: As a tonal language, Mandarin Chinese carries the following pronunciation elements in each syllable: vowel, consonant, tone, duration, and intensity. Characterizing the auditory-related cortical processing of these different pronunciation elements is of considerable interest.
Methods: A Mandarin pronunciation multifeature paradigm was designed, during which a standard stimulus and five different phonemic deviant stimuli were presented. The electroencephalogram (EEG) data were recorded with 256-electrode high-density EEG equipment. Time-domain and source localization analyses were conducted to demonstrate waveform characteristics and locate the sources of the cortical processing of mismatch negativity (MMN) and P3a components following different stimuli.
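The MMN analysis described above is conventionally computed as the difference wave between the deviant and standard event-related potentials (ERPs). A minimal sketch of that computation on simulated epochs (not the authors' pipeline; the array shapes, sampling rate, and the 100-250 ms MMN window are illustrative assumptions):

```python
import numpy as np

# Simulated epoched EEG: (n_trials, n_channels, n_samples) at 1 kHz,
# each epoch spanning -100..500 ms relative to stimulus onset.
times = np.arange(-100, 500) / 1000.0  # seconds
rng = np.random.default_rng(0)
standard = rng.normal(0, 1, (200, 4, times.size))
deviant = rng.normal(0, 1, (60, 4, times.size))
# Inject a negative deflection around 175 ms into the deviant (toy MMN).
deviant += -2.0 * np.exp(-((times - 0.175) ** 2) / (2 * 0.02 ** 2))

# ERP = average across trials; difference wave = deviant ERP - standard ERP.
erp_std = standard.mean(axis=0)
erp_dev = deviant.mean(axis=0)
diff_wave = erp_dev - erp_std  # shape: (n_channels, n_samples)

# Mean amplitude per channel in an assumed 100-250 ms MMN window.
win = (times >= 0.100) & (times <= 0.250)
mmn_amp = diff_wave[:, win].mean(axis=1)
print(mmn_amp)  # negative values indicate an MMN-like response
```

In practice this step is usually done with a dedicated toolbox (e.g., averaging epochs per condition and subtracting the evoked responses), but the arithmetic is exactly this deviant-minus-standard subtraction.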
Results: Vowel and consonant differences elicited distinct MMN and P3a components, whereas tone and duration differences did not. Intensity differences elicited distinct MMN components but not P3a components. For both MMN and P3a components, the activated cortical areas lay mainly in the frontal-temporal lobes. However, the regions and intensities of cortical activation differed significantly among the components elicited by the various deviant stimuli. The cortical areas activated by the vowel- and consonant-elicited MMN and P3a components appeared larger and showed more intense activation.
Conclusion: The auditory processing centers use different auditory-related cognitive resources when processing different Mandarin pronunciation elements. Vowels and consonants carry more information for speech comprehension; moreover, more neurons in the cortex may be involved in the recognition and cognitive processing of these elements.
Full text: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10804857 (PMC) | http://dx.doi.org/10.3389/fnins.2023.1277129 (DOI)
Sci Rep
January 2025
Graduate School of Health Sciences, Hokkaido University, Sapporo, Japan.
Subjective confidence and uncertainty are closely related to cognition and behavior. However, direct evidence that subjective confidence controls attention allocation is lacking. This study aimed to clarify whether subjective confidence could be involved in controlling attention allocation and intensity.
Brain Sci
November 2024
Department of Speech-Language-Hearing Sciences, University of Minnesota, Minneapolis, MN 55455, USA.
Background/objectives: Emotional prosody, the intonation and rhythm of speech that conveys emotion, is vital for speech communication because it provides essential context and nuance to the words being spoken. This study explored how listeners automatically process emotional prosody in speech, focusing on neural responses to different prosodic categories and potential sex differences.
Methods: The pilot data here involved 11 male and 11 female adult participants (age range: 18-28).
A resilience-based approach in American Indian (AI) communities focuses on inherent sociocultural assets that may act as protective resilience buffers linked to mitigated mental health risks (e.g., deep-rooted spirituality, robust social support networks).
Prog Neuropsychopharmacol Biol Psychiatry
January 2025
CAS Key Laboratory of Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China. Electronic address:
Objective: The relationship between the duration of untreated psychosis (DUP) and cognitive function in schizophrenia (SZ) patients remains debated, with no empirical evidence from event-related potential (ERP) studies supporting their association. This study aims to investigate the relationship between DUP and cognitive functions, as well as psychiatric symptoms, in first-episode antipsychotic-naïve SZ (FEAN-SZ) patients using ERP.
Methods: The study included 321 Chinese FEAN-SZ patients and 146 healthy controls.