Background: Maternal hypothyroidism and hypothyroxinemia are associated with poor neuropsychological development in children. Little previous research has examined whether maternal thyroid dysfunction affects sensory and linguistic development in childhood.
Methods: The Northern Finland Birth Cohort 1986 included all births within a year (9,362 women, 9,479 children) from the two northernmost Finnish provinces. Maternal serum samples (n = 5,791) were obtained in early pregnancy and analyzed for TSH, free T4, and thyroid peroxidase antibodies (TPO-Abs). Parents of 5,391 children evaluated their child's sensory and linguistic development at 7 years of age using a questionnaire (excluding children with an intelligence quotient ≤85). The prevalence of sensory and linguistic impairments was compared between mothers with and without thyroid dysfunction.
Results: There were no statistically significant differences in the prevalence of sensory or linguistic impairment between children of mothers with and without thyroid dysfunction. Children of hypothyroid and hypothyroxinemic mothers had an increased prevalence of vision impairment compared with those of euthyroid mothers (10.8% and 11.7%, respectively, versus 6.5%), but the difference was not significant. All results remained similar after excluding TPO-Ab-positive mothers and premature children.
Conclusion: We did not find an association between maternal thyroid dysfunction during pregnancy and sensory and linguistic development impairment in childhood. A somewhat higher prevalence of vision impairment was seen in children of hypothyroid and hypothyroxinemic mothers, which merits further research.
| Download full-text PDF | Source |
|---|---|
| http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5879546 | PMC |
| http://dx.doi.org/10.3389/fendo.2018.00127 | DOI Listing |
J Speech Lang Hear Res
January 2025
Research Unit of Logopedics and the Child Language Research Center, University of Oulu, Finland.
Purpose: Children develop social-pragmatic understanding with the help of sensory, cognitive, and linguistic functions by interacting with other people. This study aimed to explore (a) associations between auditory, demographic, cognitive, and linguistic factors and social-pragmatic understanding in children who use bilateral hearing aids (BiHAs) or bilateral cochlear implants (BiCIs) and in typically hearing (TH) children and (b) the effect of the group (BiHA, BiCI, TH) on social-pragmatic understanding when the effects of demographic, cognitive, and linguistic factors are controlled for.
Method: The Pragma test was used to assess social-pragmatic understanding in 119 six-year-old children: 25 children who use BiHAs, 29 who use BiCIs, and 65 TH children.
Elife
January 2025
Department of Psychology, University of York, North Yorkshire, United Kingdom.
Processing pathways between sensory and default mode network (DMN) regions support recognition, navigation, and memory but their organisation is not well understood. We show that functional subdivisions of visual cortex and DMN sit at opposing ends of parallel streams of information processing that support visually mediated semantic and spatial cognition, providing convergent evidence from univariate and multivariate task responses, intrinsic functional and structural connectivity. Participants learned virtual environments consisting of buildings populated with objects, drawn from either a single semantic category or multiple categories.
PLoS Biol
January 2025
Max Planck Institute for Psycholinguistics, Nijmegen, the Netherlands.
Studies of perception have long shown that the brain adds information to its sensory analysis of the physical environment. A touchstone example for humans is language use: to comprehend a physical signal like speech, the brain must add linguistic knowledge, including syntax. Yet, syntactic rules and representations are widely assumed to be atemporal.
Elife
January 2025
State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University & IDG/McGovern Institute for Brain Research, Beijing, China.
Speech comprehension involves the dynamic interplay of multiple cognitive processes, from basic sound perception, to linguistic encoding, and finally to complex semantic-conceptual interpretations. How the brain handles the diverse streams of information processing remains poorly understood. Applying Hidden Markov Modeling to fMRI data obtained during spoken narrative comprehension, we reveal that the whole brain networks predominantly oscillate within a tripartite latent state space.
J Acoust Soc Am
January 2025
Leiden University Centre for Linguistics, Leiden University, Leiden, The Netherlands.
Previous studies suggested that pitch characteristics of lexical tones in Standard Chinese influence various sensory perceptions, but whether they iconically bias emotional experience remained unclear. We analyzed the arousal and valence ratings of bi-syllabic words in two corpora (Study 1) and conducted an affect rating experiment using a carefully designed corpus of bi-syllabic words (Study 2). Two-alternative forced-choice tasks further tested the robustness of lexical tones' affective iconicity in an auditory nonce word context (Study 3).