AI Article Synopsis

  • Bimodal bilinguals, fluent in both signed and spoken languages, were studied to see how they use American Sign Language (ASL) facial expressions while speaking English.
  • The study found that these bilinguals used more ASL-related facial expressions and synchronized them with their spoken English, indicating that they integrate grammatical information from both languages effectively.
  • Participants used more raised eyebrows than furrowed ones, suggesting that while they can suppress non-target language expressions, they struggle to completely inhibit the influence of ASL facial grammar.

Article Abstract

Bimodal bilinguals, fluent in a signed and a spoken language, provide unique insight into the nature of syntactic integration and language control. We investigated whether bimodal bilinguals who are conversing with English monolinguals produce American Sign Language (ASL) grammatical facial expressions to accompany parallel syntactic structures in spoken English. In ASL, raised eyebrows mark conditionals, and furrowed eyebrows mark wh-questions; the grammatical brow movement is synchronized with the manual onset of the clause. Bimodal bilinguals produced more ASL-appropriate facial expressions than did nonsigners and synchronized their expressions with the onset of the corresponding English clauses. This result provides evidence for a dual-language architecture in which grammatical information can be integrated up to the level of phonological implementation. Overall, participants produced more raised brows than furrowed brows, which can convey negative affect. Bimodal bilinguals suppressed but did not completely inhibit ASL facial grammar when it conflicted with conventional facial gestures. We conclude that morphosyntactic elements from two languages can be articulated simultaneously and that complete inhibition of the nonselected language is difficult.


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2632943
DOI: http://dx.doi.org/10.1111/j.1467-9280.2008.02119.x

Publication Analysis

Top Keywords

bimodal bilinguals: 16
american sign: 8
sign language: 8
english monolinguals: 8
facial expressions: 8
eyebrows mark: 8
language: 5
bilinguals: 5
face bimodal: 4
bimodal bilingualism: 4

Similar Publications

In perceptual studies, musicality and pitch aptitude have been implicated in tone learning, while vocabulary size has been implicated in distributional (segment) learning. Moreover, working memory plays a role in the overnight consolidation of explicit-declarative L2 learning. This study examines how these factors uniquely account for individual differences in the distributional learning and consolidation of an L2 tone contrast in which the learners are tonal-language speakers and the training is implicit.


Bimodal aphasia and dysgraphia: Phonological output buffer aphasia and orthographic output buffer dysgraphia in spoken and sign language.

Cortex

January 2025

Language and Brain Lab, Sagol School of Neuroscience, and School of Education, Tel Aviv University, Tel Aviv, Israel.

We report a case of crossmodal bilingual aphasia (aphasia in two modalities, spoken and sign language) and dysgraphia in both writing and fingerspelling. The patient, Sunny, was a 42-year-old woman who had suffered a left temporo-parietal stroke; she was a speaker of Hebrew, Romanian, and English, and an adult learner and daily user of Israeli Sign Language (ISL). We assessed Sunny's spoken and sign languages using a comprehensive test battery of naming, reading, and repetition tasks, and also analysed her spontaneous speech and signing.


The macrostructure of narratives produced by children acquiring Finnish Sign Language.

J Deaf Stud Deaf Educ

November 2024

Department of Language and Communication Studies, University of Jyväskylä, Seminaarinkatu 15, PO Box 35, FI-40014, Jyväskylä, Finland.

This article investigates the narrative skills of children acquiring Finnish Sign Language (FinSL). Producing a narrative requires vocabulary, the ability to form sentences, and the cognitive skills to order events logically so that the recipient can understand the story. Research has shown that narratives are an excellent window on a child's language skills, for they reflect both grammatical competence and the ability to use the language in situationally appropriate ways.


Neural changes in sign language vocabulary learning: Tracking lexical integration with ERP measures.

Brain Lang

December 2024

Department of Cognition, Development and Educational Psychology, Institut de Neurociències, Universitat de Barcelona, Spain.

The present study aimed to investigate the neural changes related to the early stages of sign language vocabulary learning. Hearing non-signers were exposed to Catalan Sign Language (LSC) signs in three laboratory learning sessions over the course of a week. Participants completed two priming tasks designed to examine learning-related neural changes by means of N400 responses.


Distributional learning of bimodal and trimodal phoneme categories in monolingual and bilingual infants.

Infant Behav Dev

December 2024

Department of Speech-Language-Hearing: Sciences and Disorders, University of Kansas, USA.

Distributional learning has been proposed as a mechanism by which infants learn the native phonemes of the language(s) to which they are exposed. When hearing two speech streams, bilingual infants may find other strategies more useful and rely on distributional learning less than monolingual infants do. A series of studies examined how bilingual language experience affects the application of distributional learning to novel phoneme distributions.

