In transparent alphabetic languages, complete acquisition of letter-speech sound associations is expected within one year of reading instruction. The neural mechanisms underlying this acquisition have, however, hardly been investigated. The present article describes an event-related potential (ERP) study with beginner and advanced readers in which the influence of letters on speech sound processing was investigated by comparing the mismatch negativity (MMN) to speech sounds presented in isolation with the MMN to speech sounds accompanied by letters. Furthermore, the stimulus onset asynchrony (SOA) between letter and speech sound presentation was manipulated in order to investigate the development of the temporal window of integration for letter-speech sound processing. Beginner readers, despite one year of reading instruction, showed no early letter-speech sound integration, that is, no influence of the letter on the evocation of the MMN to the speech sound. Only later in the difference wave, at 650 msec, was an influence of the letter on speech sound processing revealed. Advanced readers, with four years of reading instruction, showed early and automatic letter-speech sound processing, as revealed by an enhancement of the MMN amplitude, albeit within a different temporal window of integration than that of experienced adult readers. The present results indicate a transition from mere association in beginner readers to more automatic, but still not "adult-like," integration in advanced readers. In contrast to general assumptions, the present study provides evidence for an extended developmental trajectory of letter-speech sound integration.
DOI: 10.1162/jocn.2009.21061
Related article (Eur J Neurosci, December 2024; Faculty of Education, Hokkaido University, Sapporo, Japan): The integration of visual letters and speech sounds is a crucial part of learning to read. Previous studies of this integration have revealed a modulation by audiovisual (AV) congruency, commonly known as the congruency effect. To investigate the cortical oscillations underlying the congruency effect across frequency bands, the authors conducted a priming task in Japanese in which a visual letter was followed by a speech sound.
Related article (J Cogn Neurosci, June 2024; Faculty of Education, Hokkaido University, Japan): The automatic activation of letter-speech sound (L-SS) associations is a vital step in typical reading acquisition. However, the contribution of L-SS integration to reading in a nonalphabetic native language and an alphabetic second language (L2) remains unclear. This study explored whether L-SS integration plays a similar role in a nonalphabetic language as in alphabetic languages, and examined its contribution to L2 reading among native Japanese-speaking adults with varying English proficiency.
Related article (Child Dev, July 2024; Laboratory of Language Neurobiology, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland): This study investigated the neural basis of letter and speech sound (LS) integration in 53 typical readers (35 girls, all White) during the first two years of reading education (ages 7-9). Changes in both sensory (multisensory vs. unisensory) and linguistic (congruent vs. incongruent) aspects of LS integration were examined. The left superior temporal cortex and bilateral inferior frontal cortex showed increasing activation for multisensory over unisensory LS over time, driven by reduced activation to speech sounds.