Our motivation was to examine how toddler (2;6) and adult speakers of American English prosodically realize information status categories. The aims were three-fold: 1) to analyze how adults phonologically mark information status distinctions; 2) to examine how these same categories are signaled in toddlers' spontaneous speech; and 3) to analyze the three primary acoustic correlates of prosody (F0, intensity, and duration). During a spontaneous speech task designed as an interactive game, a set of target nouns was elicited in one of three information status conditions (new, given, corrective). Results show that toddlers primarily used H* across information status categories, with secondary preferences for deaccenting given information and for using L+H* for corrective information. Only duration distinguished information status, whereas duration, average pitch, and intensity all differentiated pitch accent types for both adults and children. The discussion addresses how pitch accent selection and input guide prosodic realizations of information status.
Download full-text PDF | Source
---|---
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8567208 | PMC
http://dx.doi.org/10.1017/S0305000920000434 | DOI Listing
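As an illustrative aside, and not a description of the study's own measurement pipeline, the sketch below shows one common way to extract the three acoustic correlates named in the abstract (F0, intensity, and duration) from a recording of a single target word, using the parselmouth Python wrapper around Praat; the file name and the simple frame-averaging choices are assumptions made for the example.

```python
# Hypothetical sketch: duration, mean F0, and mean intensity for one elicited
# target-word token. Uses parselmouth (Praat bindings); the file name and the
# plain frame-averaging are illustrative assumptions, not the study's procedure.
import parselmouth

snd = parselmouth.Sound("target_word.wav")  # one hypothetical elicited noun token

# Duration of the clip in seconds.
duration = snd.duration

# F0 (pitch) track; Praat reports unvoiced frames as 0 Hz, so drop them before averaging.
pitch = snd.to_pitch()
f0 = pitch.selected_array["frequency"]
mean_f0 = f0[f0 > 0].mean()

# Intensity contour in dB, averaged over analysis frames.
intensity = snd.to_intensity()
mean_intensity = intensity.values.mean()

print(f"duration = {duration:.3f} s, mean F0 = {mean_f0:.1f} Hz, mean intensity = {mean_intensity:.1f} dB")
```

Per-token values like these could then be compared across the new, given, and corrective conditions.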
Congenit Anom (Kyoto)
January 2025
Division of Research and Treatment for Oral and Maxillofacial Congenital Anomalies, School of Dentistry, Aichi Gakuin University, Nagoya, Japan.
Pregnancy loss is a significant concern worldwide, encompassing miscarriage and stillbirth. Miscarriage, defined as the loss of a baby before 28 weeks of gestation, accounts for approximately 15% of pregnancies. Stillbirth, occurring at or after 28 weeks of gestation, affects nearly 2.
Front Hum Neurosci
January 2025
Department of Psychology, Renmin University of China, Beijing, China.
Introduction: While considerable research in language production has focused on incremental processing during conceptual and grammatical encoding, prosodic encoding remains less investigated. This study examines whether focus and accentuation processing in speech production follows linear or hierarchical incrementality.
Methods: We employed visual world eye-tracking to investigate how focus and accentuation are processed during sentence production.
Biol Psychiatry
January 2025
Psychiatry and Neuroscience Departments, Icahn School of Medicine at Mount Sinai, 1 Gustave L. Levy Place, New York City, NY, 10029.
Background: Valid scalable biomarkers for predicting longitudinal clinical outcomes in psychiatric research are crucial for optimizing intervention and prevention efforts. Here we recorded spontaneous speech from initially abstinent individuals with cocaine use disorder (iCUD) for use in predicting drug use outcomes.
Methods: At baseline, 88 iCUD provided 5-minute speech samples describing the positive consequences of quitting drug use and the negative consequences of using drugs.
Lang Learn Dev
April 2024
Department of Literatures, Cultures and Languages, University of Connecticut, Storrs, Connecticut, USA.
Joint Attention (JA) and Supported Joint Engagement (Supported JE) have each been reported to predict later language development in typically developing (TD) children and children with Autism Spectrum Disorder (ASD). In this longitudinal study including 33 TD children (20 months at V1) and 30 children with ASD (33 months at V1), the contributions of JA and Supported JE to later language, assessed via standardized tests and spontaneous speech, were directly compared. The frequency and duration of JA and Supported JE episodes were coded from 30-minute interactions with caregivers; subsequent language skills were assessed two years later.
bioRxiv
January 2025
Oregon Hearing Research Center and Vollum Institute, Oregon Health & Science University, Portland, Oregon, 97239.
Exposure to loud and/or prolonged noise damages cochlear hair cells and triggers downstream changes in synaptic and electrical activity in multiple brain regions, resulting in hearing loss and altered speech comprehension. It remains unclear, however, whether noise exposure also compromises the cochlear efferent system, a feedback pathway in the brain that fine-tunes hearing sensitivity in the cochlea. We examined the effects of noise-induced hearing loss on the spontaneous action potential (AP) firing pattern in mouse lateral olivocochlear (LOC) neurons.