Visual cortex entrains to sign language.

Proc Natl Acad Sci U S A

Department of Psychology, University of Chicago, Chicago, IL 60637

Published: June 2017

AI Article Synopsis

  • People can learn to understand any human language, whether spoken or signed, and this involves neural mechanisms that help process languages across different sensory modalities.
  • Electrophysiological oscillations in the auditory cortex respond to slow fluctuations in the acoustic envelope when listening to speech, suggesting both specialized auditory mechanisms and a general cortical ability to process rhythmic information.
  • EEG research with fluent speakers of American Sign Language (ASL) shows significant neural entrainment to visual change in sign language, particularly at lower frequencies. This indicates that similar processing mechanisms apply to both signed and spoken languages, pointing to a broader cortical flexibility in language comprehension.

Article Abstract

Despite immense variability across languages, people can learn to understand any human language, spoken or signed. What neural mechanisms allow people to comprehend language across sensory modalities? When people listen to speech, electrophysiological oscillations in auditory cortex entrain to slow (<8 Hz) fluctuations in the acoustic envelope. Entrainment to the speech envelope may reflect mechanisms specialized for auditory perception. Alternatively, flexible entrainment may be a general-purpose cortical mechanism that optimizes sensitivity to rhythmic information regardless of modality. Here, we test these proposals by examining cortical coherence to visual information in sign language. First, we develop a metric to quantify visual change over time. We find quasiperiodic fluctuations in sign language, characterized by lower frequencies than fluctuations in speech. Next, we test for entrainment of neural oscillations to visual change in sign language, using electroencephalography (EEG) in fluent speakers of American Sign Language (ASL) as they watch videos in ASL. We find significant cortical entrainment to visual oscillations in sign language <5 Hz, peaking at ~1 Hz. Coherence to sign is strongest over occipital and parietal cortex, in contrast to speech, where coherence is strongest over the auditory cortex. Nonsigners also show coherence to sign language, but entrainment at frontal sites is reduced relative to fluent signers. These results demonstrate that flexible cortical entrainment to language does not depend on neural processes that are specific to auditory speech perception. Low-frequency oscillatory entrainment may reflect a general cortical mechanism that maximizes sensitivity to informational peaks in time-varying signals.
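The abstract describes two analysis steps: quantifying visual change over time from video, and measuring coherence between that signal and EEG. The paper's exact metric and coherence pipeline are not given here; the sketch below is a hypothetical illustration on synthetic data, using a simple frame-difference metric (`visual_change_metric` is an invented name) and SciPy's magnitude-squared coherence as a stand-in.

```python
import numpy as np
from scipy.signal import coherence

def visual_change_metric(frames):
    """Mean absolute pixel difference between consecutive video frames.

    `frames` has shape (n_frames, height, width). This is a hypothetical
    stand-in for the paper's visual-change metric, not its actual method.
    """
    diffs = np.abs(np.diff(frames.astype(float), axis=0))
    return diffs.mean(axis=(1, 2))

rng = np.random.default_rng(0)
fs = 60.0  # assumed common sampling rate (Hz) for video and resampled EEG
t = np.arange(0, 60, 1 / fs)

# Synthetic quasiperiodic ~1 Hz visual-change signal plus noise, mimicking
# the slow fluctuations the abstract reports for sign language.
visual = np.sin(2 * np.pi * 1.0 * t) + 0.5 * rng.standard_normal(t.size)

# Synthetic EEG channel that partially tracks the visual signal (entrainment).
eeg = 0.6 * visual + rng.standard_normal(t.size)

# Magnitude-squared coherence between visual change and EEG.
freqs, coh = coherence(visual, eeg, fs=fs, nperseg=512)

# In this toy setup, coherence should peak near the 1 Hz driving frequency
# and fall toward the noise floor at higher frequencies.
peak_low = coh[np.argmin(np.abs(freqs - 1.0))]
mean_high = coh[freqs > 10].mean()
```

In the study itself, such coherence values would be computed per electrode and compared across frequencies and groups (signers vs. nonsigners); this sketch only shows the core signal-processing idea.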

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5474824
DOI: http://dx.doi.org/10.1073/pnas.1620350114

Publication Analysis

Top Keywords

sign language (24); language (8); visual change (8); sign (6); visual (5); visual cortex (4); cortex entrains (4); entrains sign (4); language despite (4); despite immense (4)
