Little is known about the neural changes that accompany sign language learning by hearing adults. We used ERPs and a word-sign matching task to assess how learning impacted the N400 priming effect (reduced negativity for translations compared to unrelated trials). English monolinguals (N = 32) learned 100 ASL signs: half highly iconic (meaning was guessable), half non-iconic.
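For readers unfamiliar with how such a priming effect is quantified, the sketch below (Python/NumPy) computes the mean amplitude in a conventional 300-500 ms N400 window for translation and unrelated trials and takes the difference. The array layout, variable names, and window bounds are illustrative assumptions, not the study's actual analysis pipeline.

```python
import numpy as np

def mean_amplitude(epochs, sfreq, tmin=0.3, tmax=0.5, t0_sample=0):
    """Mean voltage per trial in a post-target window (e.g., 300-500 ms for the N400).

    Assumes `epochs` is an array of shape (n_trials, n_channels, n_samples)
    with target onset at sample `t0_sample` and sampling rate `sfreq` Hz.
    """
    lo = t0_sample + int(tmin * sfreq)
    hi = t0_sample + int(tmax * sfreq)
    return epochs[:, :, lo:hi].mean(axis=(1, 2))  # one value per trial

def n400_priming_effect(epochs, condition, sfreq, t0_sample=0):
    """Unrelated minus translation mean amplitude.

    `condition` is a per-trial array of labels ("translation" / "unrelated");
    a positive value reflects the expected reduced negativity for translations.
    """
    condition = np.asarray(condition)
    amps = mean_amplitude(epochs, sfreq, t0_sample=t0_sample)
    return amps[condition == "unrelated"].mean() - amps[condition == "translation"].mean()
```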
The type of form-meaning mapping for iconic signs can vary. For perceptually-iconic signs there is a correspondence between visual features of a referent (e.g.
Grainger et al. (2006) were the first to use ERP masked priming to explore the differing contributions of phonological and orthographic representations to visual word processing. Here we adapted their paradigm to examine word processing in deaf readers.
Chinese-English bilinguals read paragraphs containing language switches in a rapid serial visual presentation paradigm, either silently while ERPs were measured (Experiment 1) or aloud (Experiment 2). Each paragraph was written in either Chinese or English with several function or content words switched to the other language. In Experiment 1, language switches elicited an early, long-lasting positivity when switching from the dominant language to the nondominant language; when switching to the dominant language, the positivity started later and was never larger than when switching to the nondominant language.
Letter recognition plays an important role in reading and follows different phases of processing, from early visual feature detection to the access of abstract letter representations. Deaf ASL-English bilinguals experience orthography in two forms: English letters and fingerspelling. However, the neurobiological nature of fingerspelling representations and the relationship between the two orthographies remain unexplored.
This article discusses the development, implementation and evaluation of clinical board game cafes in an undergraduate nurse education programme. Drawing on previous relevant literature about gaming approaches in education, the benefits and impact on student learning are presented. Thematic analysis of student feedback suggests that participation provided an opportunity to safely practise clinical scenarios and embed concepts, as well as time to socialise and build support networks.
Prior research has found that iconicity facilitates sign production in picture-naming paradigms and has effects on ERP components. These findings may be explained by two separate hypotheses: (1) a task-specific hypothesis that suggests these effects occur because visual features of the iconic sign form can map onto the visual features of the pictures, and (2) a semantic feature hypothesis that suggests that the retrieval of iconic signs results in greater semantic activation due to the robust representation of sensory-motor semantic features compared to non-iconic signs. To test these two hypotheses, iconic and non-iconic American Sign Language (ASL) signs were elicited from deaf native/early signers using a picture-naming task and an English-to-ASL translation task, while electrophysiological recordings were made.
We examined how readers process content and function words in sentence comprehension with ERPs. Participants read simple declarative sentences using a rapid serial visual presentation (RSVP) with flankers paradigm. Sentences contained either an unexpected, semantically anomalous content word or an unexpected, syntactically anomalous function word, or were well formed with no anomalies.
The role of phonology in word recognition has previously been investigated using a masked lexical decision task and transposed letter (TL) nonwords that were either pronounceable (barve) or unpronounceable (brvae). We used event-related potentials (ERPs) to investigate these effects in skilled deaf readers, who may be more sensitive to orthotactic than phonotactic constraints, which are conflated in English. Twenty deaf and twenty hearing adults completed a masked lexical decision task while ERPs were recorded.
Models vary in the extent to which language control processes are domain general. Those that posit that language control is at least partially domain general insist on an overlap between language control and executive control at the goal level. To further probe whether or not language control is domain general, we conducted the first event-related potential (ERP) study that directly compares language-switch costs, as an index of language control, and task-switch costs, as an index of executive control.
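To make the notion of a "switch cost" concrete, here is a minimal sketch computing it as the standard switch-minus-repeat difference within each domain. The toy data and column names are assumptions for illustration; the same computation applies whether the dependent measure is a reaction time or an ERP amplitude, and it is not the study's actual analysis code.

```python
import pandas as pd

# Toy trial-level data: 'domain' distinguishes language vs. task switching,
# 'trial_type' marks switch vs. repeat trials, 'rt' is the per-trial measure.
trials = pd.DataFrame({
    "domain":     ["language", "language", "task", "task"] * 2,
    "trial_type": ["switch", "repeat", "switch", "repeat"] * 2,
    "rt":         [820, 760, 910, 830, 840, 770, 905, 825],
})

# Average per condition, then take switch minus repeat within each domain.
costs = (
    trials.groupby(["domain", "trial_type"])["rt"].mean()
          .unstack("trial_type")
)
costs["switch_cost"] = costs["switch"] - costs["repeat"]  # larger = bigger cost
print(costs)
```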
Form priming has been used to identify and demarcate the processes that underlie word and sign recognition. The facilitation that results from the prime and target being related in form is typically interpreted in terms of pre-activation of linguistic representations, with little to no consideration for the potential contributions of increased perceptual overlap between related pairs. Indeed, isolating the contribution of perceptual similarity is impossible in spoken languages; there are no listeners who can perceive speech but have not acquired a sound-based phonological system.
Repetition priming and event-related potentials (ERPs) were used to investigate the time course of sign recognition in deaf users of American Sign Language. Signers performed a go/no-go semantic categorization task to rare probe signs referring to people; critical target items were repeated and unrelated signs. In Experiment 1, ERPs were time-locked either to the onset of the video or to sign onset within the video; in Experiment 2, the same full videos were clipped so that video and sign onset were aligned (removing transitional movements), and ERPs were time-locked to video/sign onset.
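A rough sketch of what time-locking the same trials to the two different onsets could look like with MNE-Python is shown below. The function and variable names, epoch window, and baseline are assumptions for illustration, not the study's actual processing parameters.

```python
import numpy as np
import mne

def epoch_two_ways(raw, video_onset_samples, sign_onset_latencies_s):
    """Epoch the same trials time-locked to video onset and to sign onset.

    Illustrative sketch: `raw` is an mne.io.Raw recording, `video_onset_samples`
    gives the sample index of each video onset, and `sign_onset_latencies_s`
    gives the latency (in seconds) of sign onset within each video.
    """
    sfreq = raw.info["sfreq"]
    onsets = np.asarray(video_onset_samples, dtype=int)

    # MNE event arrays have the form (sample index, previous value, event code).
    video_events = np.column_stack([onsets, np.zeros_like(onsets), np.ones_like(onsets)])
    sign_events = video_events.copy()
    sign_events[:, 0] += np.round(np.asarray(sign_onset_latencies_s) * sfreq).astype(int)

    # Same data, two time-locking points (video onset vs. sign onset).
    epochs_video = mne.Epochs(raw, video_events, tmin=-0.2, tmax=1.0, baseline=(None, 0))
    epochs_sign = mne.Epochs(raw, sign_events, tmin=-0.2, tmax=1.0, baseline=(None, 0))
    return epochs_video, epochs_sign
```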
Event-related potentials (ERPs) were used to explore the effects of iconicity and structural visual alignment between a picture-prime and a sign-target in a picture-sign matching task in American Sign Language (ASL). Half the targets were iconic signs and were presented after a) a matching visually-aligned picture (e.g.
It is currently unclear to what degree language control, which minimizes non-target language interference and increases the probability of selecting target-language words, is similar for sign-speech (bimodal) bilinguals and spoken language (unimodal) bilinguals. To further investigate the nature of language control processes in bimodal bilinguals, we conducted the first event-related potential (ERP) language switching study with hearing American Sign Language (ASL)-English bilinguals. The results showed a pattern that has not been observed in any unimodal language switching study: a switch-related positivity over anterior sites and a switch-related negativity over posterior sites during ASL production in both early and late time windows.
The picture word interference (PWI) paradigm and ERPs were used to investigate whether lexical selection in deaf and hearing ASL-English bilinguals occurs via lexical competition or whether the response exclusion hypothesis (REH) for PWI effects is supported. The REH predicts that semantic interference should not occur for bimodal bilinguals because sign and word responses do not compete within an output buffer. Bimodal bilinguals named pictures in ASL, preceded by a translation-equivalent, semantically related, or unrelated English written word.
We used phonological priming and ERPs to investigate the organization of the lexicon in American Sign Language. Across go/no-go repetition detection and semantic categorization tasks, targets in related pairs that shared handshape and location elicited smaller N400s than targets in unrelated pairs, indicative of facilitated processing. Handshape-related targets also elicited smaller N400s than unrelated targets, but only in the repetition task.
This study used ERPs to a) assess the neural correlates of cross-linguistic, cross-modal translation priming in hearing beginning learners of American Sign Language (ASL) and highly proficient deaf signers and b) examine whether sign iconicity modulates these priming effects. Hearing learners exhibited translation priming for ASL signs preceded by English words (greater negativity for unrelated than translation primes) later in the ERP waveform than deaf signers and exhibited earlier and greater priming for iconic than non-iconic signs. Iconicity did not modulate translation priming effects either behaviorally or in the ERPs for deaf signers (except in an 800-1000 ms time window).
A picture-naming task and ERPs were used to investigate effects of iconicity and visual alignment between signs and pictures in American Sign Language (ASL). For iconic signs, half the pictures visually overlapped with phonological features of the sign (e.g.
Deaf readers provide unique insights into how the reading circuit is modified by altered linguistic and sensory input. We investigated whether reading-matched deaf and hearing readers (n = 62) exhibit different ERP effects associated with orthographic to phonological mapping (N250) or lexico-semantic processes (N400). In a visual masked priming paradigm, participants performed a go/no-go categorization task; target words were preceded by repeated or unrelated primes.
To investigate possible universal and modality-specific factors that influence the neurophysiological response during lexical processing, we recorded event-related potentials while a large group of deaf adults (N = 40) viewed 404 signs in American Sign Language (ASL) that varied in ASL frequency, concreteness, and iconicity. Participants performed a go/no-go semantic categorization task (does the sign refer to people?) to videoclips of ASL signs (clips began with the signer's hands at rest). Linear mixed-effects regression models were fit with per-participant, per-trial, and per-electrode data, allowing us to identify unique effects of each lexical variable.
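A minimal sketch of such a per-trial mixed-effects fit, using statsmodels, might look like the following. The column names and the single by-participant random intercept are assumptions for illustration; the study's actual model specification may have been richer (e.g., additional random effects or covariates).

```python
import statsmodels.formula.api as smf

def fit_lexical_model(df):
    """Illustrative mixed-effects fit over single-trial ERP amplitudes.

    Assumes a long-format DataFrame with columns 'amplitude', 'frequency',
    'concreteness', 'iconicity', and 'subject' (one row per trial and electrode).
    """
    model = smf.mixedlm(
        "amplitude ~ frequency + concreteness + iconicity",  # lexical fixed effects
        data=df,
        groups=df["subject"],  # random intercept per participant
    )
    return model.fit()
```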
Phonology is often assumed to play a role in the tuning of orthographic representations, but it is unknown whether deaf readers' reduced access to spoken phonology reduces orthographic precision. To index how precisely deaf and hearing readers encode orthographic information, we used a masked transposed-letter (TL) priming paradigm. Word targets were preceded by TL primes formed by reversing two letters in the word and substitution primes in which the same two letters were replaced.
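As an illustration of how the two prime types could be constructed, the sketch below reverses two letters of the target (TL prime) or replaces the same two letters (substitution prime). Which letter positions are manipulated and how replacement letters are chosen are assumptions, not the study's exact stimulus rules.

```python
import random
import string

def tl_prime(word: str, i: int, j: int) -> str:
    """Transposed-letter prime: reverse the letters at positions i and j (0-indexed)."""
    letters = list(word)
    letters[i], letters[j] = letters[j], letters[i]
    return "".join(letters)

def substitution_prime(word: str, i: int, j: int) -> str:
    """Substitution prime: replace the same two letters with different random letters."""
    letters = list(word)
    for pos in (i, j):
        letters[pos] = random.choice([c for c in string.ascii_lowercase if c != word[pos]])
    return "".join(letters)

# Example (positions 2 and 3): "judge" -> TL prime "jugde", substitution prime e.g. "jupte"
print(tl_prime("judge", 2, 3), substitution_prime("judge", 2, 3))
```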
Previous studies with deaf adults reported reduced N170 waveform asymmetry to visual words, a finding attributed to reduced phonological mapping in left-hemisphere temporal regions compared to hearing adults. It remains an open question whether this pattern results from reduced phonological processing or from general neurobiological adaptations in visual processing in deaf individuals. Deaf ASL signers and hearing nonsigners performed a same-different discrimination task with visually presented words, faces, or cars, while scalp EEG time-locked to the onset of the first item in each pair was recorded.
A domain-general monitoring mechanism is proposed to be involved in overt speech monitoring. This mechanism is reflected in a medial frontal component, the error negativity (Ne), which is present on both error and correct trials (the Ne-like wave) but larger on error trials. In overt speech production, this negativity starts to rise before speech onset and is therefore associated with inner speech monitoring.
In masked priming studies with hearing readers, neighbouring words compete through lateral inhibition.