Interactive models of reading propose that phonological representations directly activate and/or constrain orthographic representations through feedback. These models also predict that spoken words should activate their orthographic forms. The effect of word orthography on auditory lexical access was investigated in two patients with alexia without agraphia. Several theories of alexia suggest that letter-by-letter reading results from impaired access to orthographic representations. Although alexics can often correctly identify orally spelled words and spell to dictation, it is unknown whether they can access the whole orthographic "word-form" as a unit via auditory presentation. The nonobligatory activation of orthography was examined in an auditory lexical decision task in which the orthographic and phonological similarity between prime and target was manipulated. In controls, the combined effect of phonological and orthographic relatedness (OP) produced greater facilitation than phonological relatedness alone, indicating that orthography can influence auditory lexical decisions. The alexics displayed patterns of facilitation comparable to those of controls, suggesting that they can quickly access whole-word orthographic information via the auditory modality. An alternative account posits that the OP advantage does not require on-line access of orthography but is instead a developmental by-product of learning to read an orthographically inconsistent language. The results have implications for cognitive theories of alexia and provide support for interactive models of word recognition.
DOI: http://dx.doi.org/10.1162/089892903770007371
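To make the facilitation measure concrete, here is a minimal sketch in Python of how priming facilitation is typically computed in a primed lexical decision design: the mean reaction-time difference between unrelated-prime and related-prime trials, calculated separately for the phonology-only condition (labeled P here for brevity) and the phonology-plus-orthography (OP) condition. The reaction times and trial counts below are invented for illustration and are not the study's data.

```python
# Minimal sketch: computing priming facilitation from hypothetical reaction times.
# Facilitation = mean RT after an unrelated prime - mean RT after a related prime.
# A larger OP than P facilitation is the "OP advantage" described in the abstract.
import statistics

# Hypothetical reaction times (ms) to target words, grouped by prime condition.
rts = {
    "unrelated": [652, 671, 648, 660, 655],
    "P": [638, 645, 630, 642, 636],    # phonologically related prime only
    "OP": [615, 622, 610, 628, 619],   # phonologically and orthographically related
}

baseline = statistics.mean(rts["unrelated"])
for cond in ("P", "OP"):
    facilitation = baseline - statistics.mean(rts[cond])
    print(f"{cond} facilitation: {facilitation:.1f} ms")
```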
J Speech Lang Hear Res
January 2025
Department of Special Education, Central China Normal University, Wuhan.
Purpose: This cross-sectional study explored how the speechreading ability of adults with hearing impairment (HI) in China would affect their perception of the four Mandarin Chinese lexical tones: high (Tone 1), rising (Tone 2), falling-rising (Tone 3), and falling (Tone 4). We predicted that higher speechreading ability would result in better tone performance and that accuracy would vary among individual tones.
Method: A total of 136 young adults with HI (ages 18-25 years) in China participated in the study and completed Chinese speechreading and tone awareness tests.
J Acoust Soc Am
January 2025
Leiden University Centre for Linguistics, Leiden University, Leiden, The Netherlands.
Previous studies suggested that pitch characteristics of lexical tones in Standard Chinese influence various sensory perceptions, but whether they iconically bias emotional experience remained unclear. We analyzed the arousal and valence ratings of bi-syllabic words in two corpora (Study 1) and conducted an affect rating experiment using a carefully designed corpus of bi-syllabic words (Study 2). Two-alternative forced-choice tasks further tested the robustness of lexical tones' affective iconicity in an auditory nonce word context (Study 3).
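The abstract does not detail how the forced-choice data are evaluated, but the standard analysis for a two-alternative forced-choice (2AFC) task tests response accuracy against the 50% chance level with a binomial test. The sketch below illustrates that analysis under that assumption; the trial counts are invented, not the study's results.

```python
# Minimal sketch: testing 2AFC accuracy against the 50% chance level with an
# exact binomial test (hypothetical trial counts, not the study's data).
from scipy.stats import binomtest

n_trials = 120   # hypothetical number of 2AFC trials
n_correct = 78   # hypothetical number of tone-congruent choices

result = binomtest(n_correct, n_trials, p=0.5, alternative="greater")
print(f"accuracy = {n_correct / n_trials:.2f}, p = {result.pvalue:.4f}")
```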
J Neurosci
January 2025
Department of Electrical and Computer Engineering, University of Maryland, College Park, MD, 20742
When we listen to speech, our brain's neurophysiological responses "track" its acoustic features, but it is less well understood how these auditory responses are enhanced by linguistic content. Here, we recorded magnetoencephalography (MEG) responses while subjects of both sexes listened to four types of continuous-speech-like passages: speech-envelope modulated noise, English-like non-words, scrambled words, and a narrative passage. Temporal response function (TRF) analysis provides strong neural evidence that cortical speech processing emerges in incremental steps, from acoustics to higher-level linguistics.
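TRF analysis is named but not unpacked above. In its standard formulation, it estimates a linear filter mapping time-lagged copies of a stimulus feature (such as the speech envelope) onto the neural response, typically fit with ridge regression. The sketch below is a minimal NumPy illustration of that formulation on simulated data, not the authors' MEG pipeline; the function name, lag count, and regularization value are illustrative assumptions (toolboxes such as the mTRF-Toolbox implement the full method).

```python
# Minimal sketch: temporal response function (TRF) estimation via time-lagged
# ridge regression on simulated data (not the authors' actual pipeline).
import numpy as np

def estimate_trf(stimulus, response, n_lags, alpha=1.0):
    """Estimate a linear filter w such that response ~= lagged(stimulus) @ w.

    stimulus: 1-D stimulus feature (e.g., speech envelope); response: 1-D
    neural signal at the same sampling rate; n_lags: number of time lags.
    """
    n = len(stimulus)
    # Design matrix: each column is the stimulus delayed by one more sample.
    X = np.zeros((n, n_lags))
    for lag in range(n_lags):
        X[lag:, lag] = stimulus[: n - lag]
    # Ridge regression: w = (X'X + alpha*I)^-1 X'y
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_lags), X.T @ response)

# Toy demonstration: recover a known 3-tap filter from noisy simulated data.
rng = np.random.default_rng(0)
stim = rng.standard_normal(5000)
true_trf = np.array([0.0, 1.0, 0.5])
resp = np.convolve(stim, true_trf)[:5000] + 0.1 * rng.standard_normal(5000)
print(estimate_trf(stim, resp, n_lags=5).round(2))  # ~ [0, 1, 0.5, 0, 0]
```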
Dev Sci
March 2025
Laboratoire de Sciences Cognitives et de Psycholinguistique, Département d'Études Cognitives, ENS, EHESS, CNRS, PSL University, Paris, France.
Before they even talk, infants become sensitive to the speech sounds of their native language and recognize the auditory form of an increasing number of words. Traditionally, these early perceptual changes are attributed to an emerging knowledge of linguistic categories such as phonemes or words. However, there is growing skepticism surrounding this interpretation due to limited evidence of category knowledge in infants.
Hum Brain Mapp
December 2024
Research Group Cognition and Plasticity, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany.
The ability to integrate semantic information into the context of a sentence is essential for human communication. Several studies have shown that the predictability of a final keyword based on the sentence context influences semantic integration at the behavioral, neurophysiological, and neural levels. However, the architecture of the underlying network interactions for semantic integration across the lifespan remains unclear.