In transparent alphabetic languages, the expected standard is that letter-speech sound associations are fully acquired within one year of reading instruction. The neural mechanisms underlying this acquisition have, however, hardly been investigated. The present article describes an event-related potential (ERP) study with beginner and advanced readers in which the influence of letters on speech sound processing was investigated by comparing the mismatch negativity (MMN) to speech sounds presented in isolation with the MMN to speech sounds accompanied by letters. Furthermore, the stimulus onset asynchrony (SOA) between letter and speech sound presentation was manipulated in order to probe the development of the temporal window of integration for letter-speech sound processing. Beginner readers, despite one year of reading instruction, showed no early letter-speech sound integration, that is, no influence of the letter on the elicitation of the MMN to the speech sound. Only later in the difference wave, at 650 msec, was an influence of the letter on speech sound processing revealed. Advanced readers, with four years of reading instruction, showed early and automatic letter-speech sound processing, as revealed by an enhancement of MMN amplitude, albeit within a different temporal window of integration than that of experienced adult readers. The present results indicate a transition from mere association in beginner readers to more automatic, but still not "adult-like," integration in advanced readers. In contrast to general assumptions, the present study provides evidence for an extended developmental trajectory of letter-speech sound integration.
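Computationally, the MMN comparison above reduces to a difference wave: the average ERP to deviant stimuli minus the average ERP to standards, computed separately for each condition (speech sound alone vs. letter plus speech sound) and each SOA, with the letter effect read off as a change in MMN amplitude. The following is a minimal sketch of that step, using simulated single-channel epochs; the sampling rate, epoch window, and 100-250 msec peak-search window are illustrative assumptions, not the study's parameters.

```python
import numpy as np

SFREQ = 500                 # assumed sampling rate (Hz)
TMIN = -0.1                 # assumed epoch start relative to sound onset (s)
N_TIMES = int(0.8 * SFREQ)  # assumed 800 ms epochs

def mmn_difference_wave(standard_epochs, deviant_epochs):
    """Average deviants minus average standards -> MMN difference wave.

    Both inputs: arrays of shape (n_trials, n_times) for one channel.
    """
    return deviant_epochs.mean(axis=0) - standard_epochs.mean(axis=0)

def mmn_peak(diff_wave, tmin=0.1, tmax=0.25):
    """Most negative point of the difference wave within an assumed
    100-250 ms search window (the classic MMN latency range)."""
    times = np.arange(diff_wave.size) / SFREQ + TMIN
    mask = (times >= tmin) & (times <= tmax)
    idx = np.argmin(diff_wave[mask])
    return times[mask][idx], diff_wave[mask][idx]

rng = np.random.default_rng(0)
# Simulated single-channel epochs: (trials, samples), in microvolts.
standards = rng.normal(0.0, 1.0, size=(400, N_TIMES))
deviants = rng.normal(0.0, 1.0, size=(80, N_TIMES))

diff = mmn_difference_wave(standards, deviants)
latency, amplitude = mmn_peak(diff)
print(f"MMN peak: {amplitude:.2f} uV at {latency * 1000:.0f} ms")
```

In the paradigm described above, the same computation would be repeated per condition and per SOA, so that the letter's influence on the MMN can be traced across the temporal window of integration.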

Source: http://dx.doi.org/10.1162/jocn.2009.21061

Publication Analysis

Top Keywords: letter-speech sound (28), sound processing (20), speech sound (16), reading instruction (12), advanced readers (12), MMN speech (12), sound (11), development letter-speech (8), acquisition letter-speech (8), sound associations (8)

Similar Publications

Article Synopsis
  • The study investigates how the brain processes speech sounds and their corresponding written letters, proposing that the two may engage similar neural pathways despite arriving through different senses.
  • Participants categorized spoken sounds more quickly and more sharply than written letters, suggesting a difference in how the two modalities are processed.
  • Key brain regions, including the left inferior frontal gyrus and auditory cortex for speech and early visual cortices for letters, indicate that although the auditory and visual systems operate largely distinctly, some processing for phonetic categorization is shared.

The integration of visual letters and speech sounds is a crucial part of learning to read. Previous studies investigating this integration have revealed a modulation by audiovisual (AV) congruency, commonly known as the congruency effect. To investigate this congruency effect across different cortical oscillatory frequency bands, we conducted a Japanese priming task in which a visual letter was followed by a speech sound.
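Quantifying a congruency effect per frequency band typically means decomposing the epoched EEG into time-frequency power and contrasting congruent against incongruent trials band by band. The sketch below, assuming MNE-Python and simulated epochs, illustrates one common route (Morlet wavelet decomposition); the channel count, band edges, and trial numbers are placeholders, not the study's settings.

```python
import numpy as np
from mne.time_frequency import tfr_array_morlet

SFREQ = 250  # assumed sampling rate (Hz)
rng = np.random.default_rng(0)

# Simulated epochs: (n_epochs, n_channels, n_times); real data would
# come from the congruent / incongruent trials of the priming task.
congruent = rng.normal(size=(60, 32, 500))
incongruent = rng.normal(size=(60, 32, 500))

# Assumed band definitions (Hz); the study's bands may differ.
bands = {"theta": (4, 7), "alpha": (8, 12), "beta": (13, 30)}
freqs = np.arange(4, 31, 1.0)

def mean_power(epochs):
    # Morlet wavelet decomposition -> power, averaged over trials.
    power = tfr_array_morlet(epochs, sfreq=SFREQ, freqs=freqs,
                             n_cycles=freqs / 2.0, output="power")
    return power.mean(axis=0)  # (n_channels, n_freqs, n_times)

cong_pow, incong_pow = mean_power(congruent), mean_power(incongruent)

# Congruency effect = congruent minus incongruent power, per band.
for name, (lo, hi) in bands.items():
    sel = (freqs >= lo) & (freqs <= hi)
    effect = (cong_pow - incong_pow)[:, sel, :].mean()
    print(f"{name}: mean congruency effect = {effect:+.3f}")
```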

Article Synopsis
  • The study investigated whether the brain processes speech sounds (phonemes) and visual letters (graphemes) differently or similarly by analyzing the neural mechanisms responsible for phonetic categorization.
  • Participants categorized phonemes and graphemes at different speeds, responding more quickly to sounds, while EEG results indicated distinct yet overlapping brain regions involved in processing each modality.
  • The findings suggest that while phonetic categorization operates primarily within auditory and visual sensory areas, there is some shared neural representation for phonemic and graphemic information in specific brain regions.

The automatic activation of letter-speech sound (L-SS) associations is a vital step in typical reading acquisition. However, the contribution of L-SS integration during nonalphabetic native and alphabetic second language (L2) reading remains unclear. This study explored whether L-SS integration plays a similar role in a nonalphabetic language as in alphabetic languages and its contribution to L2 reading among native Japanese-speaking adults with varying English proficiency.


This study investigated the neural basis of letter-speech sound (LS) integration in 53 typical readers (35 girls, all White) during the first 2 years of reading education (ages 7-9). Changes in both sensory (multisensory vs. unisensory) and linguistic (congruent vs. incongruent) aspects of LS integration were examined. The left superior temporal cortex and bilateral inferior frontal cortex showed increasing activation for multisensory over unisensory LS stimuli over time, driven by reduced activation to speech sounds.
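The two aspects examined in this design correspond to two simple contrasts over condition-wise activation estimates: a sensory contrast (multisensory letter-sound pairs vs. the mean of the unisensory conditions) and a linguistic contrast (congruent vs. incongruent pairs). A toy sketch with made-up per-voxel estimates, just to make the arithmetic of the two contrasts explicit; the condition names and values are hypothetical, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)
N_VOXELS = 1000  # illustrative; stands in for voxels in a region of interest

# Hypothetical per-voxel activation estimates (e.g., GLM betas) for each
# condition; real values would come from the fMRI model.
beta = {
    "letters_only": rng.normal(0.2, 1.0, N_VOXELS),
    "sounds_only": rng.normal(0.5, 1.0, N_VOXELS),
    "congruent_pairs": rng.normal(1.0, 1.0, N_VOXELS),
    "incongruent_pairs": rng.normal(0.8, 1.0, N_VOXELS),
}

# Sensory contrast: multisensory (mean of the two pair conditions)
# vs. the mean of the two unisensory conditions.
multisensory = (beta["congruent_pairs"] + beta["incongruent_pairs"]) / 2
unisensory = (beta["letters_only"] + beta["sounds_only"]) / 2
sensory_contrast = multisensory - unisensory

# Linguistic contrast: congruent vs. incongruent letter-sound pairs.
linguistic_contrast = beta["congruent_pairs"] - beta["incongruent_pairs"]

print(f"mean sensory contrast:    {sensory_contrast.mean():+.3f}")
print(f"mean linguistic contrast: {linguistic_contrast.mean():+.3f}")
```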
