Sign languages are natural languages in the visual domain. Because they lack a written form, they provide a sharper tool than spoken languages for investigating lexicality effects, which in spoken languages may be confounded by orthographic processing. In a previous study, we showed that the neural networks supporting phoneme monitoring in deaf British Sign Language (BSL) users are modulated by phonology but not by lexicality or iconicity.
Working memory (WM) for spoken language improves when the to-be-remembered items correspond to preexisting representations in long-term memory. We investigated whether this effect generalizes to the visuospatial domain by administering a visual n-back WM task to deaf signers and hearing signers, as well as to hearing nonsigners. Four different kinds of stimuli were presented: British Sign Language (BSL; familiar to the signers), Swedish Sign Language (SSL; unfamiliar), nonsigns, and nonlinguistic manual actions.
The study of signed languages allows the dissociation of sensorimotor and cognitive neural components of the language signal. Here we investigated the neurocognitive processes underlying the monitoring of two phonological parameters of sign languages: handshape and location. Our goal was to determine if brain regions processing sensorimotor characteristics of different phonological parameters of sign languages were also involved in phonological processing, with their activity being modulated by the linguistic content of manual actions.
Sensory cortices undergo crossmodal reorganisation as a consequence of sensory deprivation. Congenital deafness in humans is a special case among forms of sensory deprivation, because cortical reorganisation is a consequence not only of auditory deprivation but also of language-driven mechanisms. Visual crossmodal plasticity has been found in secondary auditory cortices of deaf individuals, but it is still unclear whether reorganisation also takes place in primary auditory areas, and how this relates to language modality and auditory deprivation.
Similar working memory (WM) for lexical items has been demonstrated in signers and non-signers, whereas short-term memory (STM) is consistently poorer in deaf than in hearing individuals. In the present study, we investigated digit-based WM and STM in Swedish and British deaf signers and hearing non-signers. To maintain good experimental control, we used printed stimuli throughout and held response mode constant across groups.
Disentangling the effects of sensory and cognitive factors on neural reorganization is fundamental for establishing the relationship between plasticity and functional specialization. Auditory deprivation in humans provides a unique insight into this problem, because the origin of the anatomical and functional changes observed in deaf individuals is not only sensory, but also cognitive, owing to the implementation of visual communication strategies such as sign language and speechreading. Here, we describe a functional magnetic resonance imaging study of individuals with different auditory deprivation and sign language experience.
In this study, the first to explore the cortical correlates of signed language (SL) processing under point-light display conditions, observers identified either a signer or a lexical sign from displays in which different signers were seen producing a number of different individual signs. Many of the regions activated by point-light displays under these conditions replicated those previously reported for full-image displays, including regions within the inferior temporal cortex that are specialised for face and body-part identification, although such body parts were invisible in the display. Right frontal regions were also recruited, a pattern not usually seen in full-image SL processing.
Studies of spoken and signed language processing reliably show involvement of the posterior superior temporal cortex. This region is also reliably activated by observation of meaningless oral and manual actions. In this study, we directly compared the extent to which activation in the posterior superior temporal cortex is modulated by linguistic knowledge, irrespective of differences in language form.
Studies of written and spoken language suggest that nonidentical brain networks support semantic and syntactic processing. Event-related brain potential (ERP) studies of spoken and written languages show that semantic anomalies elicit a posterior bilateral N400, whereas syntactic anomalies elicit a left anterior negativity, followed by a broadly distributed late positivity. The present study assessed whether these ERP indicators index the activity of language systems specific to the processing of aural-oral language, or whether they index neural systems underlying any natural language, including sign language.
Most of our knowledge about the neurobiological bases of language comes from studies of spoken languages. By studying signed languages, we can determine whether what we have learnt so far is characteristic of language per se or whether it is specific to languages that are spoken and heard. Overwhelmingly, lesion and neuroimaging studies indicate that the neural systems supporting signed and spoken language are very similar: both involve a predominantly left-lateralised perisylvian network.
Spoken languages use a single set of articulators, the vocal tract, whereas signed languages use multiple articulators, including both manual and facial actions. How sensitive are the cortical circuits for language processing to the particular articulators that are observed? This question can only be addressed with participants who use both speech and a signed language. In this study, we used functional magnetic resonance imaging to compare speechreading and sign processing in deaf native signers of British Sign Language (BSL) who were also proficient speechreaders.
This fMRI study explored the functional neural organisation of seen speech in congenitally deaf native signers and hearing non-signers. Both groups showed extensive activation in perisylvian regions for speechreading words compared to viewing the model at rest. In contrast to earlier findings, activation in left middle and posterior portions of superior temporal cortex, including regions within the lateral sulcus and the superior and middle temporal gyri, was greater for deaf than hearing participants.
In fingerspelling, different hand configurations are used to represent the different letters of the alphabet. Signers use this method of representing written language to fill lexical gaps in a signed language. Using fMRI, we compared cortical networks supporting the perception of fingerspelled, signed, written, and pictorial stimuli in deaf native signers of British Sign Language (BSL).
Neuroimaging studies of written and spoken sentence processing report greater left hemisphere than right hemisphere activation. However, a large majority of our experience with language is face-to-face interaction, which is much richer in information. The current study examines the neural organization of audio-visual (AV) sentence processing using functional magnetic resonance imaging (fMRI) at 4 Tesla.