Werker and Tees (1984) prompted decades of research attempting to detail the paths infants take towards specialisation for the sounds of their native language(s). Most of this research has examined the trajectories of monolingual children. However, it has also been proposed that bilinguals, who are exposed to greater phonetic variability than monolinguals and must learn the rules of two languages, may remain perceptually open to non-native language sounds later into life than monolinguals.
Bilingual infants rely differently than monolinguals on facial information, such as lip patterns, to differentiate their native languages. This may explain, at least in part, why young monolinguals and bilinguals show differences in social attention. For example, in the first year, bilinguals attend faster and more often to static faces over non-faces than do monolinguals (Mercure et al.).
Background: Most people have strong left-brain lateralisation for language, with a minority showing right- or bilateral language representation. On some receptive language tasks, however, lateralisation appears to be reduced or absent. This contrasting pattern raises the question of whether and how language laterality may fractionate within individuals.
Lateralised language processing is a well-established finding in monolinguals. In bilinguals, studies using fMRI have typically found substantial regional overlap between the two languages, though results may be influenced by factors such as proficiency, age of acquisition and exposure to the second language. Few studies have focused specifically on individual differences in brain lateralisation, and those that have suggest that reduced lateralisation may characterise representation of the second language (L2) in some bilingual individuals.
Visual information conveyed by a speaking face aids speech perception. In addition, children's ability to comprehend visual-only speech (speechreading ability) is related to phonological awareness and reading skills in both deaf and hearing children. We tested whether training speechreading would improve speechreading, phoneme blending, and reading ability in hearing children.
Purpose: Speechreading (lipreading) is a correlate of reading ability in both deaf and hearing children. We investigated whether the relationship between speechreading and single-word reading is mediated by phonological awareness in deaf and hearing children. Method: In two separate studies, 66 deaf children and 138 hearing children, aged 5-8 years, were assessed on measures of speechreading, phonological awareness, and single-word reading.
When people talk, they move their hands to enhance meaning. Using accelerometry, we measured whether people spontaneously use their artificial limbs (prostheses) to gesture, and whether this behavior relates to everyday prosthesis use and perceived embodiment. Perhaps surprisingly, one- and two-handed participants did not differ in the number of gestures they produced in gesture-facilitating tasks.
Recent neuroimaging studies suggest that monolingual infants activate a left-lateralized frontotemporal brain network in response to spoken language, which is similar to the network involved in processing spoken and signed language in adulthood. However, it is unclear how brain activation to language is influenced by early experience in infancy. To address this question, we present functional near-infrared spectroscopy (fNIRS) data from 60 hearing infants (4 to 8 months of age): 19 monolingual infants exposed to English, 20 unimodal bilingual infants exposed to two spoken languages, and 21 bimodal bilingual infants exposed to English and British Sign Language (BSL).
Deaf late signers provide a unique perspective on the impact of impoverished early language exposure on the neurobiology of language: insights that cannot be gained from research with hearing people alone. Here we contrast the effect of age of sign language acquisition in hearing and congenitally deaf adults to examine the potential impact of impoverished early language exposure on the neural systems supporting a language learnt later in life. We collected fMRI data from deaf and hearing proficient users (N = 52) of British Sign Language (BSL), who learnt BSL either early (native) or late (after the age of 15 years), whilst they watched BSL sentences or strings of meaningless nonsense signs.
Conceptual knowledge is fundamental to human cognition. Yet, the extent to which it is influenced by language is unclear. Studies of semantic processing show that similar neural patterns are evoked by the same concepts presented in different modalities.
Purpose: We developed, and evaluated in a randomized controlled trial, a computerized speechreading training program to determine (a) whether it is possible to train speechreading in deaf children and (b) whether speechreading training results in improvements in phonological and reading skills. Previous studies indicate a relationship between speechreading and reading skill and further suggest this relationship may be mediated by improved phonological representations. This is important since many deaf children find learning to read very challenging.
The effect of sensory experience on hemispheric specialisation for language production is not well understood. Children born deaf, including those who have cochlear implants, have drastically different perceptual experiences of language than their hearing peers. Using functional transcranial Doppler sonography (fTCD), we measured lateralisation during language production in a heterogeneous group of 19 deaf children and in 19 hearing children, matched on language ability.
Faces capture and maintain infants' attention more than other visual stimuli. The present study addresses the impact of early language experience on attention to faces in infancy. It was hypothesized that infants learning two spoken languages (unimodal bilinguals) and hearing infants of Deaf mothers learning British Sign Language and spoken English (bimodal bilinguals) would show enhanced attention to faces compared to monolinguals.
Infants as young as 2 months can integrate auditory and visual aspects of speech articulation. A shift of attention from the eyes towards the mouth of talking faces occurs around 6 months of age in monolingual infants. However, it is unknown whether this pattern of attention during audiovisual speech processing is influenced by speech and language experience in infancy.
For children who are born deaf, lipreading (speechreading) is an important source of access to spoken language. We used eye tracking to investigate the strategies used by deaf (n = 33) and hearing (n = 59) 5-8-year-olds during a sentence speechreading task. The proportion of time spent looking at the mouth during speech correlated positively with speechreading accuracy.
To investigate how hearing status, sign language experience, and task demands influence functional responses in the human superior temporal cortices (STC), we collected fMRI data from deaf and hearing participants (male and female) who acquired sign language either early or late in life. Our stimuli in all tasks were pictures of objects. We varied the linguistic and visuospatial processing demands across three different tasks that involved decisions about (1) the sublexical (phonological) structure of the British Sign Language (BSL) signs for the objects, (2) the semantic category of the objects, and (3) the physical features of the objects.
Previous research has provided evidence for a speechreading advantage in congenitally deaf adults compared to hearing adults. A 'perceptual compensation' account of this finding proposes that prolonged early-onset deafness leads to a greater reliance on visual, as opposed to auditory, information when perceiving speech, which in turn results in superior visual speech perception skills in deaf adults. In the current study, we tested whether the speechreading advantage previously demonstrated in profoundly congenitally deaf adults with hearing aids, or no amplification, is also apparent in adults with the same deafness profile who have had greater access to the auditory elements of speech via a cochlear implant (CI).
The neural systems supporting speech and sign processing are very similar, although not identical. In a previous fTCD study of hearing native signers (Gutierrez-Sigut, Daws, et al., 2015), we found stronger left lateralization for sign than for speech.
Neuroimaging studies suggest greater involvement of the left parietal lobe in sign language compared to speech production. This stronger activation might be linked to the specific demands of sign encoding and proprioceptive monitoring. In Experiment 1 we investigate hemispheric lateralization during sign and speech generation in hearing native users of English and British Sign Language (BSL).
Background: Vocabulary knowledge and speechreading are important for deaf children's reading development, but it is unknown whether they are independent predictors of reading ability.
Aims: This study investigated the relationships between reading, speechreading and vocabulary in a large cohort of deaf and hearing children aged 5 to 14 years.
Methods and Procedures: 86 severely and profoundly deaf children and 91 hearing children participated in this study.
Here we adopt a novel strategy to investigate phonological assembly. Participants performed a visual lexical decision task in English in which the letters in words and letter strings were delivered either sequentially (promoting phonological assembly) or simultaneously (not promoting phonological assembly). A region of interest analysis confirmed the engagement of regions previously associated with phonological assembly in studies contrasting different word types.
Studies to date that have used fTCD to examine language lateralisation have predominantly used word or sentence generation tasks. Here we sought to further assess the sensitivity of fTCD to language lateralisation by using a metalinguistic task which does not involve novel speech generation: rhyme judgement in response to written words. Line array judgement was included as a non-linguistic visuospatial task to examine the relative strength of left and right hemisphere lateralisation within the same individuals when output requirements of the tasks are matched.
Cochlear implantation (CI) for profound congenital hearing impairment, while often successful in restoring hearing to the deaf child, does not always result in effective speech processing. Exposure to non-auditory signals during the pre-implantation period is widely held to be responsible for such failures. Here, we question the inference that such exposure irreparably distorts the function of auditory cortex, negatively impacting the efficacy of CI.