Phonotactic Constraints Are Activated across Languages in Bilinguals.

Front Psychol

Bilingualism and Psycholinguistics Research Group, Roxelyn and Richard Pepper Department of Communication Sciences and Disorders, Northwestern University, Evanston, IL, USA.

Published: May 2016

During spoken language comprehension, auditory input activates a bilingual's two languages in parallel on the basis of phonological representations that are shared across languages. However, it is unclear whether bilinguals access phonotactic constraints from the non-target language while processing the target language. For example, Spanish words cannot begin with s+consonant onsets; the phonotactic constraint calls for epenthesis, the addition of a vowel (e.g., stable/estable). Native Spanish speakers may therefore produce English words such as estudy ("study") with epenthesis, suggesting that these bilinguals apply Spanish phonotactic constraints when speaking English. The present study is the first to examine whether bilinguals access Spanish phonotactic constraints during English comprehension. In an English cross-modal priming lexical decision task, Spanish-English bilinguals and English monolinguals heard English cognate and non-cognate primes containing s+consonant onsets, or control primes without s+ onsets, followed by a lexical decision on visual targets that either carried the /e/ phonotactic constraint or did not. Results revealed that bilinguals responded faster to /es/ non-word targets preceded by s+ cognate primes, and to /es/ and /e/ non-word targets preceded by s+ non-cognate primes, confirming that English primes containing s+ onsets activated Spanish phonotactic constraints. These findings are discussed within current accounts of parallel activation of two languages during bilingual spoken language comprehension, which may be expanded to include activation of phonotactic constraints from the irrelevant language.
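As a reading aid only, the sketch below enumerates the prime and target cells implied by the design described above (prime onset x cognate status x target type). The item strings are illustrative placeholders drawn from the examples in the abstract (stable/estable, study/estudy), not the authors' actual stimuli or analysis code.

```python
# Illustrative sketch only: enumerate the prime x target cells implied by the
# abstract's cross-modal priming design. Item strings are placeholders taken
# from the examples in the text, not the study's actual materials.
from itertools import product

prime_onset = ["s+consonant onset (e.g., 'stable', 'study')",
               "control onset (no s+consonant)"]
prime_status = ["cognate (e.g., 'stable'/'estable')",
                "non-cognate (e.g., 'study')"]
target_type = ["non-word target with /es/ or /e/ (e.g., 'estudy'-like)",
               "control target without /e/"]

# Each combination is one priming condition; lexical decision latencies to the
# visual target would be compared across these cells.
for onset, status, target in product(prime_onset, prime_status, target_type):
    print(f"{onset} | {status} | {target}")
```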


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4870387
DOI: http://dx.doi.org/10.3389/fpsyg.2016.00702

Publication Analysis

Top Keywords

phonotactic constraints (28); spanish phonotactic (12); phonotactic (8); spoken language (8); language comprehension (8); bilinguals access (8); consonant onsets (8); lexical decision (8); non-cognate primes (8); non-word targets (8)

Similar Publications

Review of the Phonological System of Contemporary Urdu Spoken in Pakistan.

Int J Speech Lang Pathol

February 2025

Academic Unit of Human Communication, Learning, and Development (HCLD), The University of Hong Kong, Pok Fu Lam, Hong Kong.

Purpose: Urdu, the lingua franca and national language of Pakistan, is the 10th most-spoken language worldwide, with over 230 million speakers. The Urdu phonological system has been examined over the past several decades; however, the system continues to evolve.


Infants' preference for vowel harmony (VH, a phonotactic constraint that requires vowels in a word to be featurally similar) is thought to be language-specific: Monolingual infants learning VH languages show a listening preference for VH patterns by 6 months of age, while those learning non-VH languages do not (Gonzalez-Gomez et al., 2019; Van Kampen et al., 2008).


This article investigates the role of phonological well-formedness constraints in Mandarin speakers' phonotactic grammar and how they affect online speech processing. Mandarin non-words can be categorized into systematic gaps and accidental gaps, depending on whether they violate principled phonotactic constraints based on the Obligatory Contour Principle (OCP). Non-word acceptability judgment experiments have shown that systematic gaps received lower wordlikeness ratings than accidental gaps.


The experimental study of artificial language learning has become a widely used means of investigating the predictions of theories of language learning and representation. Although much is now known about the generalizations that learners make from various kinds of data, relatively little is known about how those representations affect speech processing. This paper presents an event-related potential (ERP) study of brain responses to violations of lab-learned phonotactics.


The role of phonology in word recognition has previously been investigated using a masked lexical decision task and transposed letter (TL) nonwords that were either pronounceable (barve) or unpronounceable (brvae). We used event-related potentials (ERPs) to investigate these effects in skilled deaf readers, who may be more sensitive to orthotactic than phonotactic constraints, which are conflated in English. Twenty deaf and twenty hearing adults completed a masked lexical decision task while ERPs were recorded.

