The complexities of how prosodic structure, at both the phrasal and syllable levels, shapes speech production have begun to be illuminated through studies of articulatory behavior. The present study contributes to an understanding of prosodic signatures on articulation by examining the joint effects of phrasal and syllable position on the production of consonants. Articulatory kinematic data were collected for five subjects using electromagnetic articulography (EMA) to record target consonants (labial, labiodental, and tongue tip) located (1) in either syllable-final or syllable-initial position and (2) either at a phrase edge or phrase-medially. Spatial and temporal characteristics of the consonantal constriction formation and release were determined from kinematic landmarks in the articulator velocity profiles. The results indicate that syllable and phrasal position consistently affect movement duration, whereas effects on displacement are more variable. For most subjects, the boundary-adjacent portions of the movement (constriction release for a preboundary coda and constriction formation for a postboundary onset) are not differentially affected in terms of phrasal lengthening; both lengthen comparably.
DOI: http://dx.doi.org/10.1121/1.2130950
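As a rough illustration of the kind of velocity-based landmark analysis described in the abstract, the sketch below locates movement onset and offset where articulator speed crosses a fraction of its peak velocity and derives movement duration and displacement from those landmarks. This is a minimal sketch, not the authors' pipeline; the 20% threshold, sampling rate, and function names are illustrative assumptions.

```python
import numpy as np

def movement_landmarks(position, fs, threshold=0.2):
    """Return (onset, peak, offset) sample indices for a single closing or
    opening gesture, defined where speed crosses a fraction of its maximum."""
    velocity = np.gradient(position) * fs          # central-difference velocity (units/s)
    speed = np.abs(velocity)
    peak = int(np.argmax(speed))
    crit = threshold * speed[peak]                 # e.g., 20% of peak speed (assumed criterion)
    before = np.where(speed[:peak] < crit)[0]      # last sub-threshold sample before the peak
    onset = int(before[-1]) if before.size else 0
    after = np.where(speed[peak:] < crit)[0]       # first sub-threshold sample after the peak
    offset = int(peak + after[0]) if after.size else len(position) - 1
    return onset, peak, offset

def duration_and_displacement(position, fs, threshold=0.2):
    onset, _, offset = movement_landmarks(position, fs, threshold)
    duration_ms = (offset - onset) / fs * 1000.0
    displacement = abs(position[offset] - position[onset])
    return duration_ms, displacement

# purely synthetic example trace: hold, constriction formation, hold (500 Hz sampling assumed)
fs = 500.0
trace = np.concatenate([np.zeros(50), np.linspace(0.0, 8.0, 60), np.full(50, 8.0)])
print(duration_and_displacement(trace, fs))
```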
J Acoust Soc Am
May 2024
Department of Linguistics, University of California, Santa Barbara, Santa Barbara, California 93106, USA.
This electromagnetic articulography study explores the kinematic profile of Intonational Phrase boundaries in Seoul Korean. Recent findings suggest that the scope of phrase-final lengthening is conditioned by word- and/or phrase-level prominence. However, evidence comes mainly from head-prominence languages, which conflate positions of word prosody with positions of phrasal prominence.
Cereb Cortex
September 2023
Department of Neurology, Beijing Tiantan Hospital, Capital Medical University, Beijing 100070, China.
Speech comprehension requires listeners to rapidly parse continuous speech into hierarchically organized linguistic structures (i.e., syllable, word, phrase, and sentence) and to entrain neural activity to the rhythms of these different linguistic levels.
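For readers unfamiliar with how such multi-level tracking is commonly quantified, the sketch below simulates a response containing components at assumed sentence (1 Hz), phrase (2 Hz), and syllable (4 Hz) rates and checks for spectral peaks at each rate. The rates, simulated signal, and peak index are illustrative assumptions, not this study's data or analysis.

```python
import numpy as np

fs = 200.0                                  # sampling rate in Hz (assumed)
t = np.arange(0, 60.0, 1.0 / fs)            # 60 s of simulated response
rng = np.random.default_rng(0)
# toy "neural" signal with components at assumed sentence (1 Hz), phrase (2 Hz),
# and syllable (4 Hz) rates, buried in noise
signal = (0.5 * np.sin(2 * np.pi * 1.0 * t)
          + 0.8 * np.sin(2 * np.pi * 2.0 * t)
          + 1.0 * np.sin(2 * np.pi * 4.0 * t)
          + rng.normal(scale=2.0, size=t.size))

spectrum = np.abs(np.fft.rfft(signal)) / t.size
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

for rate, label in [(1.0, "sentence"), (2.0, "phrase"), (4.0, "syllable")]:
    idx = int(np.argmin(np.abs(freqs - rate)))
    # crude tracking index: spectral peak relative to neighboring frequency bins
    neighbors = np.r_[spectrum[idx - 5:idx - 1], spectrum[idx + 2:idx + 6]]
    print(f"{label:8s} ({rate:.0f} Hz): peak / neighbor mean = "
          f"{spectrum[idx] / neighbors.mean():.2f}")
```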
eNeuro
June 2023
Center for MRI Research, Academy for Advanced Interdisciplinary Studies, Peking University, Beijing 100871, China.
Native speakers excel at parsing continuous speech into smaller elements and entraining their neural activity to the linguistic hierarchy at different levels (e.g., syllables, phrases, and sentences) to achieve speech comprehension.
Dev Cogn Neurosci
February 2023
Laboratoire de Neuroanatomie et Neuroimagerie translationnelles, UNI - ULB Neuroscience Institute, Université libre de Bruxelles (ULB), Brussels, Belgium; BCBL, Basque Center on Cognition, Brain and Language, San Sebastian, Spain; Laboratory of Neurophysiology and Movement Biomechanics, UNI - ULB Neuroscience Institute, Université libre de Bruxelles (ULB), Brussels, Belgium.
Humans' extraordinary ability to understand speech in noise relies on multiple processes that develop with age. Using magnetoencephalography (MEG), we characterize the underlying neuromaturational basis by quantifying how cortical oscillations in 144 participants (aged 5-27 years) track phrasal and syllabic structures in connected speech mixed with different types of noise. While the extraction of prosodic cues from clear speech was stable during development, its maintenance in a multi-talker background matured rapidly up to age 9 and was associated with speech comprehension.
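One common way to quantify this kind of cortical tracking is coherence between the speech temporal envelope and a neural channel, summarized in delta (roughly phrasal) and theta (roughly syllabic) bands. The sketch below uses simulated signals, an assumed sampling rate, and assumed band edges; it is a generic illustration, not the analysis used in this study.

```python
import numpy as np
from scipy.signal import coherence

fs = 100.0                                  # common post-downsampling rate (assumed)
t = np.arange(0, 120.0, 1.0 / fs)
rng = np.random.default_rng(1)

# toy speech envelope with slow (phrase-like) and faster (syllable-like) modulations
envelope = 1.0 + 0.5 * np.sin(2 * np.pi * 0.6 * t) + 0.3 * np.sin(2 * np.pi * 5.0 * t)
# toy "MEG" channel: a delayed, noisy copy of that envelope
lag = int(0.1 * fs)
meg = np.roll(envelope, lag) + rng.normal(scale=1.0, size=t.size)

freqs, coh = coherence(envelope, meg, fs=fs, nperseg=int(10 * fs))

def band_mean(lo, hi):
    mask = (freqs >= lo) & (freqs <= hi)
    return coh[mask].mean()

print(f"delta band (0.5-2 Hz, roughly phrasal): {band_mean(0.5, 2.0):.3f}")
print(f"theta band (4-8 Hz, roughly syllabic):  {band_mean(4.0, 8.0):.3f}")
```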
Psych J
February 2023
Center for the Cognitive Science of Language, Beijing Language and Culture University, Beijing, China.
The adult brain can efficiently track both lower-level (i.e., syllable) and higher-level (i.e., ...)