The consequences of profound deafness for oral language development in children are drastic and well known. The modern multichannel cochlear implant (CI) has been shown to enhance speech production skills in prelingually deaf children. Speech production skills, however, are known not to be a reliable reflection of oral language competence as a whole. Language is a common code acquired within a specific group, enabling the exchange of ideas, feelings, and knowledge; in humans, speech is one of the channels that conveys language. Assessing language development in CI children is therefore more demanding than simply assessing speech production skills. Many factors may contribute to a poor or an excellent outcome, making it difficult to compare groups of children with and without a CI. The present study compared receptive language levels in pair-matched children from CI and non-CI groups. Its main conclusion is that language comprehension scores increase significantly more over time after surgery in CI children than in pair-matched non-CI children, despite the better initial pure-tone audiometric thresholds of the latter.
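As a purely illustrative sketch of the kind of pair-matched comparison described above, the snippet below simulates hypothetical comprehension-score gains for matched CI and non-CI children and applies a paired t-test. The group size, scores, and effect sizes are invented and do not come from the study.

```python
# Illustrative sketch only -- not the study's actual analysis. It assumes
# hypothetical comprehension-score gains over a post-surgery follow-up for
# pair-matched CI and non-CI children, then tests whether the gains differ.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_pairs = 20  # hypothetical number of matched pairs

# Hypothetical score gains (arbitrary units) over the follow-up period.
ci_gain = rng.normal(loc=12.0, scale=4.0, size=n_pairs)      # CI children
non_ci_gain = rng.normal(loc=6.0, scale=4.0, size=n_pairs)   # matched non-CI children

# Paired t-test on the gains, respecting the pair-matched design.
t_stat, p_value = stats.ttest_rel(ci_gain, non_ci_gain)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```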
Ophthalmologie
January 2025
Augenklinik Sulzbach, Knappschaftsklinikum Saar, An der Klinik 10, 66280, Sulzbach/Saar, Germany.
Background: The increasing bureaucratic burden in everyday clinical practice impairs doctor-patient communication (DPC). Effective use of digital technologies, such as automated semantic speech recognition (ASR) with automated extraction of diagnostically relevant information, can provide a solution.
Objective: The aim was to determine the extent to which ASR, in conjunction with semantic information extraction for automated documentation of the doctor-patient dialogue (ADAPI), can be integrated into everyday clinical practice, using the IVI routine as an example, and whether patient care can be improved through process optimization.
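As a rough, hypothetical sketch of the general pipeline this abstract describes (an ASR transcript followed by extraction of diagnostically relevant items into a structured note), the snippet below applies simple rule-based patterns to an invented transcript. The transcript text, field names, and patterns are illustrative only; this is not the ADAPI system or the IVI workflow.

```python
# Minimal illustrative sketch: ASR transcript -> rule-based extraction of
# documentation fields. All content here is invented for illustration.
import re

transcript = (
    "Patient reports blurred vision in the right eye for two weeks. "
    "Visual acuity today 0.5. We will schedule the next intravitreal injection."
)

# Hypothetical extraction rules for a structured note.
patterns = {
    "symptom": r"reports ([^.]+)",
    "visual_acuity": r"[Vv]isual acuity today ([0-9]+(?:\.[0-9]+)?)",
    "plan": r"[Ww]e will ([^.]+)",
}

note = {}
for field, pattern in patterns.items():
    match = re.search(pattern, transcript)
    if match:
        note[field] = match.group(1).strip()

print(note)
```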
J Neural Eng
January 2025
Department of Neurology, Northwestern University Feinberg School of Medicine, 320 East Superior St, Chicago, IL 60611, USA.
Brain-machine interfaces (BMIs) have advanced greatly in decoding speech signals originating from the speech motor cortices. These BMIs primarily target individuals whose speech motor cortices are intact but who are paralyzed because the connections between frontal cortices and the articulators have been disrupted by brainstem stroke or motor neuron diseases such as amyotrophic lateral sclerosis. A few studies have shown that regions outside the speech motor cortices, such as the parietal and temporal lobes, also carry information that may be useful for BMIs.
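To illustrate the decoding idea in the loosest terms, the sketch below trains a linear classifier to map synthetic neural feature vectors (e.g., band power per electrode) to a small set of phoneme labels. The feature dimensions, labels, and data are invented and do not represent any published BMI pipeline.

```python
# Toy sketch of neural speech decoding on synthetic data only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_trials, n_features, n_phonemes = 300, 64, 4  # hypothetical sizes

# Synthetic neural features with a class-dependent offset so decoding is possible.
labels = rng.integers(0, n_phonemes, size=n_trials)
features = rng.normal(size=(n_trials, n_features))
features[np.arange(n_trials), labels] += 2.0  # inject label information

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0
)
decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"decoding accuracy: {decoder.score(X_test, y_test):.2f}")
```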
Support Care Cancer
January 2025
Department of Medical Oncology, Netherlands Cancer Institute - Antoni van Leeuwenhoek, 1066 CX, Amsterdam, the Netherlands.
Purpose: Adolescent and young adult (AYA) malignant brain tumour (BT) survivors are at risk of adverse health outcomes, which may impact their health-related quality of life (HRQoL). This study aimed to investigate (1) the prevalence of physical and psychological adverse health outcomes, (2) HRQoL, and (3) the association between adverse health outcomes and HRQoL among long-term AYA-BT survivors. Adverse health outcomes and HRQoL were compared with those of other AYA cancer (AYAC) survivors.
Dev Sci
March 2025
Department of Pediatrics and Adolescent Medicine, Comprehensive Center for Pediatrics, Medical University of Vienna, Vienna, Austria.
Newborns are able to neurally discriminate between speech and nonspeech right after birth. To date, it remains unknown whether this early speech discrimination and the underlying neural language network are associated with later language development. Preterm-born children are an interesting cohort in which to investigate this relationship, as previous studies have shown that preterm-born neonates exhibit alterations in speech processing and have a greater risk of later language deficits.
iScience
January 2025
Vivian L. Smith Department of Neurosurgery, McGovern Medical School at UT Health Houston, Houston, TX 77030, United States of America.
Speech production engages a distributed network of cortical and subcortical brain regions. The supplementary motor area (SMA) has long been thought to be a key hub that coordinates activity across these regions to initiate voluntary movements, including speech. We analyzed direct intracranial recordings from 115 patients with epilepsy as they articulated a single word in a subset of trials from a picture-naming task.
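A schematic of the kind of trial-based analysis such recordings allow, not the study's actual method: epoch a continuous signal around articulation onsets and average across trials for one electrode. The sampling rate, onset times, and signal below are invented.

```python
# Schematic epoching sketch on synthetic data only.
import numpy as np

fs = 1000                                                  # assumed sampling rate (Hz)
signal = np.random.default_rng(0).normal(size=fs * 120)    # 2 min of fake recording
onsets = np.arange(5, 115, 5) * fs                         # hypothetical onsets (samples)

pre, post = int(0.5 * fs), int(1.0 * fs)                   # 500 ms before to 1 s after onset
epochs = np.stack([signal[t - pre: t + post] for t in onsets])
evoked = epochs.mean(axis=0)                               # trial-averaged response

print(epochs.shape, evoked.shape)                          # (n_trials, n_samples), (n_samples,)
```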