We exploit the phenomenon of cross-modal, cross-language activation to examine the dynamics of language processing. Previous within-language work showed that seeing a sign coactivates phonologically related signs, just as hearing a spoken word coactivates phonologically related words. In this study, we conducted a series of eye-tracking experiments using the visual world paradigm to investigate the time course of cross-language coactivation in hearing bimodal bilinguals (Spanish-Spanish Sign Language) and unimodal bilinguals (Spanish/Basque).
Spoken words and signs both consist of structured sub-lexical units. While phonemes unfold over time in the spoken signal, visual sub-lexical units such as location and handshape are produced simultaneously in signs. In the current study we investigate the role of sub-lexical units in lexical access in spoken Spanish and in Spanish Sign Language (LSE) in hearing early bimodal bilinguals and in hearing second language (L2) learners of LSE, both groups native speakers of Spanish, using the visual world paradigm.
J Exp Psychol Learn Mem Cogn
November 2017
This study investigated whether language control during production in bilinguals generalizes across modalities, and to what extent the language control system is shaped by competition for the same articulators. Using a cued language-switching paradigm, we examined whether switch costs arise when hearing signers switch between a spoken and a signed language. The results showed an asymmetrical switch cost for bimodal bilinguals in both reaction time (RT) and accuracy, with larger costs for the (dominant) spoken language.