Publications by authors named "Holcomb P"

Little is known about the neural changes that accompany sign language learning by hearing adults. We used ERPs and a word-sign matching task to assess how learning impacted the N400 priming effect (reduced negativity for translations compared to unrelated trials). English monolinguals (N = 32) learned 100 ASL signs: half highly iconic (meaning was guessable) and half non-iconic.

The type of form-meaning mapping for iconic signs can vary. For perceptually-iconic signs there is a correspondence between visual features of a referent (e.g.

When a sequence of written words is briefly presented and participants are asked to identify just one word at a post-cued location, then word identification accuracy is higher when the word is presented in a grammatically correct sequence compared with an ungrammatical sequence. This sentence superiority effect has been reported in several behavioral studies and two EEG investigations. Taken together, the results of these studies support the hypothesis that the sentence superiority effect is primarily driven by rapid access to a sentence-level representation via partial word identification processes that operate in parallel over several words.

Grainger et al. (2006) were the first to use ERP masked priming to explore the differing contributions of phonological and orthographic representations to visual word processing. Here we adapted their paradigm to examine word processing in deaf readers.

Chinese-English bilinguals silently read paragraphs containing language switches, presented with a rapid serial visual presentation paradigm, while ERPs were measured (Experiment 1), or read them aloud (Experiment 2). Each paragraph was written in either Chinese or English with several function or content words switched to the other language. In Experiment 1, language switches elicited an early, long-lasting positivity when switching from the dominant language to the nondominant language, but when switching to the dominant language, the positivity started later and was never larger than when switching to the nondominant language.

Early postnatal brain development involves complex interactions among maturing neurons and glial cells that drive tissue organization. We previously analyzed gene expression in tissue from the mouse medial nucleus of the trapezoid body (MNTB) during the first postnatal week to study changes that surround rapid growth of the large calyx of Held (CH) nerve terminal. Here, we present genes that show significant changes in gene expression level during the second postnatal week, a developmental timeframe that brackets the onset of airborne sound stimulation and the early stages of myelination.

Letter recognition plays an important role in reading and follows different phases of processing, from early visual feature detection to the access of abstract letter representations. Deaf ASL-English bilinguals experience orthography in two forms: English letters and fingerspelling. However, the neurobiological nature of fingerspelling representations, and the relationship between the two orthographies, remains unexplored.

Deaf and hearing readers have different access to spoken phonology, which may affect the representation and recognition of written words. We used ERPs to investigate how a matched sample of deaf and hearing adults (total n = 90) responded to lexical characteristics of 480 English words in a go/no-go lexical decision task. Results from mixed-effects regression models showed a) visual complexity produced small effects in opposing directions for deaf and hearing readers, b) similar frequency effects, but shifted earlier for deaf readers, c) more pronounced effects of orthographic neighborhood density for hearing readers, and d) more pronounced effects of concreteness for deaf readers.

Prior research has found that iconicity facilitates sign production in picture-naming paradigms and has effects on ERP components. These findings may be explained by two separate hypotheses: (1) a task-specific hypothesis that suggests these effects occur because visual features of the iconic sign form can map onto the visual features of the pictures, and (2) a semantic feature hypothesis that suggests that the retrieval of iconic signs results in greater semantic activation due to the robust representation of sensory-motor semantic features compared to non-iconic signs. To test these two hypotheses, iconic and non-iconic American Sign Language (ASL) signs were elicited from deaf native/early signers using a picture-naming task and an English-to-ASL translation task, while electrophysiological recordings were made.

We examined how readers process content and function words in sentence comprehension with ERPs. Participants read simple declarative sentences using a rapid serial visual presentation (RSVP) with flankers paradigm. Sentences contained either an unexpected semantically anomalous content word or an unexpected syntactically anomalous function word, or were well formed with no anomalies.

The role of phonology in word recognition has previously been investigated using a masked lexical decision task and transposed letter (TL) nonwords that were either pronounceable (barve) or unpronounceable (brvae). We used event-related potentials (ERPs) to investigate these effects in skilled deaf readers, who may be more sensitive to orthotactic than phonotactic constraints, which are conflated in English. Twenty deaf and twenty hearing adults completed a masked lexical decision task while ERPs were recorded.

We compared processing of letter and symbol stimuli presented briefly in the right or left visual field, and either in isolation or surrounded by two flanking characters of the same category. The flankers could be arranged horizontally or vertically. Participants performed a two-alternative forced choice (2AFC) task with the isolated character or the central character in flanked displays as target.

Neural tissue maturation is a coordinated process under tight transcriptional control. We previously analyzed the kinetics of gene expression in the medial nucleus of the trapezoid body (MNTB) in the brainstem during the critical postnatal phase of its development. While this work revealed timed execution of transcriptional programs, it was blind to the specific cells where gene expression changes occurred.

Article Synopsis
  • Co-activation of semantically related concepts can lead to both interference and facilitation in language tasks, depending on how words are related (taxonomically vs. thematically).
  • Taxonomically related words (like WOLF and DOG) tend to cause interference, whereas thematically related words (like BONE and DOG) can enhance word retrieval.
  • In a recent study using electroencephalography, researchers observed distinct brain responses during word retrieval, indicating that different semantic relationships activate different brain regions and processes simultaneously.

Models of visual word recognition differ as to how print exposure modulates orthographic precision. In some models, precision is the optimal end state of a lexical representation; the associations between letters and positions are initially approximate and become more precise as readers gain exposure to the word. In others, flexible orthographic coding that allows for rapid access to semantics (i.

Models vary in the extent to which language control processes are domain general. Those that posit that language control is at least partially domain general insist on an overlap between language control and executive control at the goal level. To further probe whether or not language control is domain general, we conducted the first event-related potential (ERP) study that directly compares language-switch costs, as an index of language control, and task-switch costs, as an index of executive control.

Form priming has been used to identify and demarcate the processes that underlie word and sign recognition. The facilitation that results from the prime and target being related in form is typically interpreted in terms of pre-activation of linguistic representations, with little to no consideration of the potential contribution of increased perceptual overlap between related pairs. Indeed, isolating the contribution of perceptual similarity is impossible in spoken languages; there are no listeners who can perceive speech but have not acquired a sound-based phonological system.

Repetition priming and event-related potentials (ERPs) were used to investigate the time course of sign recognition in deaf users of American Sign Language. Signers performed a go/no-go semantic categorization task to rare probe signs referring to people; critical target items were repeated and unrelated signs. In Experiment 1, ERPs were time-locked either to the onset of the video or to sign onset within the video; in Experiment 2, the same full videos were clipped so that video and sign onset were aligned (removing transitional movements), and ERPs were time-locked to video/sign onset.

Event-related potentials (ERPs) were used to explore the effects of iconicity and structural visual alignment between a picture-prime and a sign-target in a picture-sign matching task in American Sign Language (ASL). Half the targets were iconic signs and were presented after a) a matching visually-aligned picture (e.g.

It is currently unclear to what degree language control, which minimizes non-target language interference and increases the probability of selecting target-language words, is similar for sign-speech (bimodal) bilinguals and spoken language (unimodal) bilinguals. To further investigate the nature of language control processes in bimodal bilinguals, we conducted the first event-related potential (ERP) language switching study with hearing American Sign Language (ASL)-English bilinguals. The results showed a pattern that has not been observed in any unimodal language switching study: a switch-related positivity over anterior sites and a switch-related negativity over posterior sites during ASL production in both early and late time windows.

The picture word interference (PWI) paradigm and ERPs were used to investigate whether lexical selection in deaf and hearing ASL-English bilinguals occurs via lexical competition or whether the response exclusion hypothesis (REH) for PWI effects is supported. The REH predicts that semantic interference should not occur for bimodal bilinguals because sign and word responses do not compete within an output buffer. Bimodal bilinguals named pictures in ASL, preceded by either a translation equivalent, semantically-related, or unrelated English written word.

The extent to which higher-order representations can be extracted from more than one word in parallel remains an unresolved issue with theoretical import. Here, we used ERPs to investigate the timing with which semantic information is extracted from parafoveal words. Participants saw animal and non-animal targets paired with response congruent or incongruent flankers in a semantic categorization task.

We used transposed-letter (TL) priming to test the lexical tuning hypothesis, which states that words from high-density orthographic neighborhoods have more precise orthographic codes than words from low-density neighborhoods. Replicating standard TL priming effects, target words elicited faster lexical decision responses and smaller amplitude N250s and N400s when preceded by TL primes (e.g.

We used phonological priming and ERPs to investigate the organization of the lexicon in American Sign Language. Across go/no-go repetition detection and semantic categorization tasks, targets in related pairs that shared handshape and location elicited smaller N400s than targets in unrelated pairs, indicative of facilitated processing. Handshape-related targets also elicited smaller N400s than unrelated targets, but only in the repetition task.

This study used ERPs to a) assess the neural correlates of cross-linguistic, cross-modal translation priming in hearing beginning learners of American Sign Language (ASL) and deaf highly proficient signers and b) examine whether sign iconicity modulates these priming effects. Hearing learners exhibited translation priming for ASL signs preceded by English words (greater negativity for unrelated than translation primes) later in the ERP waveform than deaf signers, and exhibited earlier and greater priming for iconic than non-iconic signs. Iconicity did not modulate translation priming effects either behaviorally or in the ERPs for deaf signers (except in an 800-1000 ms time window).