A failure to build solid letter-speech sound associations may contribute to reading impairments in developmental dyslexia. Whether this reduced neural integration of letters and speech sounds changes over time within individual children, and how it relates to behavioral gains in reading skills, remains unknown. In this study, we examined changes in event-related potential (ERP) measures of letter-speech sound integration over a 6-month period during which 9-year-old dyslexic readers (n = 17) followed training in letter-speech sound coupling alongside their regular reading curriculum. We presented the Dutch spoken vowels /a/ and /o/ as standard and deviant stimuli in one auditory and two audiovisual oddball conditions. In one audiovisual condition (AV0), the letter "a" was presented simultaneously with the vowels, while in the other (AV200) it preceded vowel onset by 200 ms. Prior to the training (T1), dyslexic readers showed the expected pattern of typical auditory mismatch responses, together with the absence of letter-speech sound effects in a late negativity (LN) window. After the training (T2), our results showed earlier (and enhanced) crossmodal effects in the LN window. Most interestingly, earlier LN latency at T2 was significantly related to higher behavioral accuracy in letter-speech sound coupling. On a more general level, the timing of the earlier mismatch negativity (MMN) in the simultaneous condition (AV0) measured at T1 was significantly related to reading fluency at both T1 and T2, as well as to reading gains. Our findings suggest that the reduced neural integration of letters and speech sounds in dyslexic children may show moderate improvement with reading instruction and training, and that behavioral improvements relate especially to individual differences in the timing of this neural integration.
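To make the oddball logic concrete, here is a minimal sketch in Python (NumPy only) of how a mismatch response can be derived as the deviant-minus-standard difference wave and how a peak latency can be read off within a late-negativity window. The sampling rate, epoch window, trial counts, window bounds, and array names are illustrative assumptions, not parameters reported in the study.

```python
import numpy as np

# Illustrative sketch only: epochs_std and epochs_dev stand in for single-
# electrode EEG epochs (trials x samples) time-locked to vowel onset.
# Sampling rate, epoch window, and trial counts are assumptions.
fs = 512                                     # samples per second
times = np.arange(-0.1, 0.7, 1.0 / fs)       # epoch from -100 to 700 ms

rng = np.random.default_rng(0)
epochs_std = rng.normal(size=(120, times.size))  # standard /a/ trials
epochs_dev = rng.normal(size=(30, times.size))   # deviant /o/ trials

# Mismatch response: average deviant ERP minus average standard ERP.
difference_wave = epochs_dev.mean(axis=0) - epochs_std.mean(axis=0)

# Peak latency of the negativity within a window of interest, e.g. a late
# negativity (LN) window (here arbitrarily set to 400-700 ms).
ln_window = (times >= 0.4) & (times <= 0.7)
ln_latency = times[ln_window][np.argmin(difference_wave[ln_window])]
print(f"LN peak latency: {ln_latency * 1000:.0f} ms")
```

Comparing such latencies across the auditory, AV0, and AV200 conditions, and between T1 and T2, is the kind of timing measure the abstract relates to reading fluency and reading gains.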
| Download full-text PDF | Source |
|---|---|
| http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4478392 | PMC |
| http://dx.doi.org/10.3389/fnhum.2015.00369 | DOI Listing |
Eur J Neurosci
December 2024
Faculty of Education, Hokkaido University, Sapporo, Japan.
The integration of visual letters and speech sounds is a crucial part of learning to read. Previous studies of this integration have revealed a modulation by audiovisual (AV) congruency, commonly known as the congruency effect. To characterize the cortical oscillations underlying this congruency effect across frequency bands, we used a Japanese priming task in which a visual letter was followed by a speech sound.
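As a rough illustration of how a congruency effect in band-limited oscillatory power can be computed, the following sketch (Python with NumPy/SciPy) contrasts the power envelope of congruent and incongruent trials in one frequency band. The band choice, sampling rate, trial counts, and array names are assumptions made for illustration and are not taken from the cited study.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

# Illustrative sketch only: cong and incong stand in for single-channel EEG
# epochs (trials x samples) for congruent and incongruent letter-sound pairs.
# Sampling rate, trial counts, and the theta-band choice are assumptions.
fs = 500
rng = np.random.default_rng(1)
cong = rng.normal(size=(80, fs))     # 1-s epochs, congruent trials
incong = rng.normal(size=(80, fs))   # 1-s epochs, incongruent trials

def band_power(epochs, lo, hi):
    """Mean power envelope per time point in a frequency band
    (band-pass filter followed by the Hilbert envelope)."""
    b, a = butter(4, [lo, hi], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, epochs, axis=1)
    return (np.abs(hilbert(filtered, axis=1)) ** 2).mean(axis=0)

# Congruency effect: incongruent minus congruent power, e.g. in theta (4-8 Hz).
theta_effect = band_power(incong, 4, 8) - band_power(cong, 4, 8)
print("Mean theta-band congruency effect:", theta_effect.mean())
```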
J Cogn Neurosci
June 2024
Faculty of Education, Hokkaido University, Japan.
The automatic activation of letter-speech sound (L-SS) associations is a vital step in typical reading acquisition. However, the contribution of L-SS integration to nonalphabetic native and alphabetic second language (L2) reading remains unclear. This study explored whether L-SS integration plays a similar role in a nonalphabetic language to the one it plays in alphabetic languages, and how it contributes to L2 reading among native Japanese-speaking adults with varying English proficiency.
Child Dev
July 2024
Laboratory of Language Neurobiology, Nencki Institute of Experimental Biology, Polish Academy of Sciences (PAS), Warsaw, Poland.
This study investigated the neural basis of letter and speech sound (LS) integration in 53 typical readers (35 girls, all White) during the first 2 years of reading education (ages 7-9). Changes in both sensory (multisensory vs. unisensory) and linguistic (congruent vs. incongruent) aspects of LS integration were examined. The left superior temporal cortex and bilateral inferior frontal cortex showed increasing activation for multisensory over unisensory LS stimuli over time, driven by reduced activation to speech sounds.