A failure to build solid letter-speech sound associations may contribute to reading impairments in developmental dyslexia. Whether this reduced neural integration of letters and speech sounds changes over time within individual children, and how it relates to behavioral gains in reading skills, remains unknown. In this research, we examined changes in event-related potential (ERP) measures of letter-speech sound integration over a 6-month period during which 9-year-old dyslexic readers (n = 17) completed a training in letter-speech sound coupling alongside their regular reading curriculum. We presented the Dutch spoken vowels /a/ and /o/ as standard and deviant stimuli in one auditory and two audiovisual oddball conditions. In one audiovisual condition (AV0), the letter "a" was presented simultaneously with the vowels, while in the other (AV200) it preceded vowel onset by 200 ms. Prior to the training (T1), dyslexic readers showed the expected pattern of typical auditory mismatch responses, together with the absence of letter-speech sound effects in a late negativity (LN) window. After the training (T2), our results showed earlier (and enhanced) crossmodal effects in the LN window. Most interestingly, earlier LN latency at T2 was significantly related to higher behavioral accuracy in letter-speech sound coupling. On a more general level, the timing of the earlier mismatch negativity (MMN) in the simultaneous condition (AV0) measured at T1 was significantly related to reading fluency at both T1 and T2, as well as to reading gains. Our findings suggest that the reduced neural integration of letters and speech sounds in dyslexic children may show moderate improvement with reading instruction and training, and that behavioral improvements relate especially to individual differences in the timing of this neural integration.
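The two audiovisual conditions differ only in stimulus onset asynchrony: in AV0 the letter and vowel start together, while in AV200 the letter leads by 200 ms. The trial structure can be sketched as follows; this is a minimal illustration, not the authors' stimulus-delivery code, and the function name, trial count, and deviant probability (here 12.5%) are illustrative assumptions.

```python
import random

def make_oddball_sequence(n_trials=400, deviant_prob=0.125, soa_ms=0, seed=1):
    """Generate an illustrative audiovisual oddball sequence.

    Standard stimulus: spoken vowel /a/; deviant: /o/. Each trial pairs the
    sound with the visual letter "a". soa_ms is the lag between letter onset
    and vowel onset: 0 models the AV0 condition, 200 models AV200.
    """
    rng = random.Random(seed)  # fixed seed for a reproducible sequence
    trials = []
    for _ in range(n_trials):
        sound = "/o/" if rng.random() < deviant_prob else "/a/"
        trials.append({
            "letter": "a",
            "letter_onset_ms": 0,
            "sound": sound,
            "sound_onset_ms": soa_ms,  # vowel onset lags the letter by soa_ms
        })
    return trials

av0 = make_oddball_sequence(soa_ms=0)     # letter and vowel simultaneous
av200 = make_oddball_sequence(soa_ms=200) # letter precedes vowel by 200 ms
```

The mismatch response is then computed by contrasting ERPs to the rare /o/ deviants against the frequent /a/ standards within each condition.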


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4478392
DOI: http://dx.doi.org/10.3389/fnhum.2015.00369


Similar Publications

Article Synopsis
  • The study investigates how the brain processes speech sounds and their corresponding written letters, proposing that both may use similar neural pathways despite differing senses.
  • Participants showed quicker and more defined categorization of spoken sounds than written letters, suggesting a difference in how these modalities are processed.
  • Key brain regions, including the left inferior frontal gyrus and auditory cortex for speech, and early visual cortices for letters, demonstrate that while auditory and visual systems operate distinctly, there is some shared processing for phonetic categorization.

The integration of visual letters and speech sounds is a crucial part of learning to read. Previous studies investigating this integration have revealed a modulation by audiovisual (AV) congruency, commonly known as the congruency effect. To investigate the cortical oscillations underlying the congruency effect across different frequency bands, we conducted a Japanese priming task in which a visual letter was followed by a speech sound.

Article Synopsis
  • The study investigated whether the brain processes speech sounds (phonemes) and visual letters (graphemes) differently or similarly by analyzing neural mechanisms responsible for phonetic categorization.
  • Participants categorized phonemes and graphemes at different speeds, showing quicker responses for sounds, while EEG results indicated distinct yet overlapping brain regions involved in processing each modality.
  • The findings suggest that while phonetic categorization operates primarily within auditory and visual sensory areas, there is some shared neural representation for phonemic and graphemic information in specific brain regions.

The automatic activation of letter-speech sound (L-SS) associations is a vital step in typical reading acquisition. However, the contribution of L-SS integration during nonalphabetic native and alphabetic second language (L2) reading remains unclear. This study explored whether L-SS integration plays a similar role in a nonalphabetic language as in alphabetic languages and its contribution to L2 reading among native Japanese-speaking adults with varying English proficiency.


This study investigated the neural basis of letter and speech sound (LS) integration in 53 typical readers (35 girls, all White) during the first 2 years of reading education (ages 7-9). Changes in both sensory (multisensory vs. unisensory) and linguistic (congruent vs. incongruent) aspects of LS integration were examined. The left superior temporal cortex and bilateral inferior frontal cortex showed increasing activation for multisensory over unisensory LS stimuli over time, driven by reduced activation to speech sounds.

