Publications by authors named "Rachel Mayberry"

Research shows that insufficient language access in early childhood significantly affects language processing. While the majority of this work focuses on syntax, phonology also appears to be affected, though it is unclear exactly how. Here we investigated phonological production across age of acquisition of American Sign Language (ASL).

Research on the language acquisition of deaf individuals who are exposed to accessible linguistic input at a variety of ages has provided evidence for a sensitive period of first language acquisition. Recent studies have shown that deaf individuals who first learn language after early childhood, late first-language learners (LL1), do not comprehend reversible Subject-Verb-Object (SVO) sentences. The present study analyzed 478 signed productions from 28 signers, elicited with pictures depicting simple events with one or two arguments.

The hypothesis that impoverished language experience affects complex sentence structure development around the end of early childhood was tested using a fully randomized, sentence-to-picture matching study in American Sign Language (ASL). The participants were ASL signers who had impoverished or typical access to language in early childhood. Deaf signers whose access to language was highly impoverished in early childhood (N = 11) primarily comprehended structures consisting of a single verb and argument (Subject or Object), agreeing verbs, and the spatial relation or path of semantic classifiers.

Multiple studies have reported mathematics underachievement for students who are deaf, but the onset, scope, and causes of this phenomenon remain understudied. Early language deprivation might be one factor influencing the acquisition of numbers. In this study, we investigated a basic and fundamental mathematical skill, automatic magnitude processing, in two formats (Arabic digits and American Sign Language number signs) and the influence of age of first language exposure on both formats by using two versions of the Number Stroop Test.

Due to the ubiquitous nature of language in the environment of infants, how it affects the anatomical structure of the brain language system over the lifespan is not well understood. In this study, we investigated the effects of early language experience on the adult brain by examining anatomical features of individuals born deaf with typical or restricted language experience in early childhood. Twenty-two deaf adults whose primary language was American Sign Language, and who were first immersed in it at ages ranging from birth to 14 years, participated.

Spoken language research has investigated how pronouns are influenced by grammar and semantics/pragmatics. In contrast, sign language research has focused on unambiguous pronominal reference arising from spatial co-reference. However, understanding signed pronouns contributes to cross-linguistically valid models of pronoun production and comprehension.

Areas within the left-lateralized neural network for language have been found to be sensitive to syntactic complexity in spoken and written language. Previous research has revealed that these areas are active for sign language as well, but whether they respond specifically to syntactic complexity in sign language, independent of lexical processing, has yet to be determined. To investigate this question, we used fMRI to neuroimage deaf native signers' comprehension of 180 sign strings in American Sign Language (ASL) with a picture-probe recognition task.

Implicit causality (IC) biases, the tendency of certain verbs to elicit re-mention of either the first-mentioned noun phrase (NP1) or the second-mentioned noun phrase (NP2) from the previous clause, are important in psycholinguistic research. Understanding IC verbs and the source of their biases in signed as well as spoken languages helps elucidate whether these phenomena are language general or specific to the spoken modality. As the first of its kind, this study investigates IC biases in American Sign Language (ASL) and provides IC bias norms for over 200 verbs, facilitating future psycholinguistic studies of ASL and comparisons of spoken versus signed languages.

Limited language experience in childhood is common among deaf individuals and, as prior research has shown, leads to low levels of language processing. Although basic structures such as word order have been found to be resilient to conditions of sparse language input in early life, whether they are robust to conditions of extreme language delay is unknown. The sentence comprehension strategies of post-childhood, first-language (L1) learners of American Sign Language (ASL) with at least 9 years of language experience were investigated, in comparison to two control groups of learners with full access to language from birth (deaf native signers and hearing L2 learners who were native English speakers).

Previous research on reference tracking has revealed a tendency towards over-explicitness in second language (L2) learners. Only limited evidence exists that this trend extends to situations where the learner's first and second languages do not share a sensory-motor modality. Using a story-telling paradigm, this study examined how hearing novice L2 learners accomplish reference tracking in American Sign Language (ASL), and whether they transfer strategies from gesture.

Previous research has identified ventral and dorsal white matter tracts as being crucial for language processing; their maturation correlates with increased language processing capacity. Unknown is whether the growth or maintenance of these language-relevant pathways is shaped by language experience in early life. To investigate the effects of early language deprivation and the sensory-motor modality of language on white matter tracts, we examined the white matter connectivity of language-relevant pathways in congenitally deaf people with or without early access to language.

This paper looks at numeral incorporation in Russian Sign Language (RSL). Numeral incorporation is the simultaneous combination of a numeral and a base sign into one sign. Incorporating forms typically use the numerical handshape combined simultaneously with the movement, location, and orientation of the base lexical sign; for example, "three months" will be expressed through an incorporating form 3_month.

Previous studies suggest that age of acquisition affects the outcomes of learning, especially at the morphosyntactic level. Unknown is how syntactic development is affected by increased cognitive maturity and delayed language onset. The current paper examined the early syntactic development of adolescent first-language learners by analyzing word order patterns in American Sign Language (ASL).

Prediction during sign language comprehension may enable signers to integrate linguistic and non-linguistic information within the visual modality. In two eyetracking experiments, we investigated American Sign Language (ASL) semantic prediction in deaf adults and children (aged 4-8 years). Participants viewed ASL sentences in a visual world paradigm in which the sentence-initial verb was either neutral or constrained relative to the sentence-final target noun.

The extent to which development of the brain language system is modulated by the temporal onset of linguistic experience relative to post-natal brain maturation is unknown. This crucial question cannot be investigated with the hearing population because spoken language is ubiquitous in the environment of newborns. Deafness blocks infants' language experience in a spoken form, and in a signed form when it is absent from the environment.

The target article's call to end reliance on acceptability judgments is premature. First, it restricts syntactic inquiry to cases where a semantically equivalent alternative is available. Second, priming studies require groups of participants who are linguistically homogeneous and whose grammar is known to the researcher.

In this reply to Salverda (2016), we address a critique of the claims made in our recent study of real-time processing of American Sign Language (ASL) signs using a novel visual world eye-tracking paradigm (Lieberman, Borovsky, Hatrak, & Mayberry, 2015). Salverda asserts that our data do not support our conclusion that native signers and late-learning signers show variable patterns of activation in the presence of phonological competitors. We provide a logical rationale for our study design and present a reanalysis of our data using a modified time window, providing additional evidence for our claim.

Discussions of reference tracking in spoken languages often invoke some version of a referential hierarchy. In this paper, we asked whether this hierarchy applies equally well to reference tracking in a visual language, American Sign Language, or whether modality differences influence its structure. Expanding on the results of previous studies, this study examined ASL referential devices beyond nouns, pronouns, and zero anaphora.

Psycholinguistic studies of sign language processing provide valuable opportunities to assess whether language phenomena, which are primarily studied in spoken language, are fundamentally shaped by peripheral biology. For example, we know that when given a choice between two syntactically permissible ways to express the same proposition, speakers tend to choose structures that were recently used, a phenomenon known as syntactic priming. Here, we report two experiments testing syntactic priming of a noun phrase construction in American Sign Language (ASL).

All natural languages develop devices to communicate who did what to whom. Elicited pantomime offers one model for studying this process, providing a window into how humans (hearing non-signers) behave in a natural communicative modality (silent gesture) without established conventions from a grammar. Most studies in this paradigm focus on production, although they sometimes make assumptions about how comprehenders would likely behave.

Language acquisition involves learning not only grammatical rules and a lexicon, but also what someone is intending to convey with their utterance: the semantic/pragmatic component of language. In this paper we separate the contributions of linguistic development and cognitive maturity to the acquisition of the semantic/pragmatic component of language by comparing deaf adults who had either early or late first exposure to their first language (ASL). We focus on the particular type of meaning at the semantic/pragmatic interface called , for which preschool-age children typically differ from adults.

Sign language comprehension requires visual attention to the linguistic signal and visual attention to referents in the surrounding world, whereas these processes are divided between the auditory and visual modalities for spoken language comprehension. Additionally, the age of onset of first language acquisition and the quality and quantity of linguistic input for deaf individuals are highly heterogeneous, which is rarely the case for hearing learners of spoken languages. Little is known about how these modality and developmental factors affect real-time lexical processing.

One key question in neurolinguistics is the extent to which the neural processing system for language requires linguistic experience during early life to develop fully. We conducted a longitudinal anatomically constrained magnetoencephalography (aMEG) analysis of lexico-semantic processing in 2 deaf adolescents who had no sustained language input until 14 years of age, when they became fully immersed in American Sign Language. After 2 to 3 years of language, the adolescents' neural responses to signed words were highly atypical, localizing mainly to right dorsal frontoparietal regions and often responding more strongly to semantically primed words (Ferjan Ramirez N, Leonard MK, Torres C, Hatrak M, Halgren E, Mayberry RI.
