To form a unified percept of our environment, the human brain integrates information within and across the senses. This MEG study investigated interactions within and between sensory modalities using a frequency analysis of steady-state responses (SSRs), which are elicited time-locked to periodically modulated stimuli. Critically, in the frequency domain, interactions between sensory signals are indexed by crossmodulation terms (i.e. the sums and differences of the fundamental frequencies). The 3 × 2 factorial design manipulated (1) modality: auditory, visual or audiovisual; and (2) steady-state modulation: the auditory and visual signals were modulated in either one sensory feature (e.g. visual gratings modulated in luminance at 6 Hz) or two features (e.g. tones modulated in frequency at 40 Hz and amplitude at 0.2 Hz). This design enabled us to investigate the crossmodulation frequencies that are elicited when two stimulus features are modulated concurrently (i) within one sensory modality or (ii) across the auditory and visual modalities. In support of within-modality integration, we reliably identified crossmodulation frequencies when two stimulus features in one sensory modality were modulated at different frequencies. In contrast, no crossmodulation frequencies were identified when information needed to be combined across the auditory and visual modalities. The absence of audiovisual crossmodulation frequencies suggests that the previously reported audiovisual interactions in primary sensory areas may mediate low-level spatiotemporal coincidence detection, which is prominent for stimulus transients but less relevant for sustained SSRs. In conclusion, our results indicate that information in SSRs is integrated over multiple time scales within, but not across, sensory modalities at the primary cortical level.


Source
http://dx.doi.org/10.1016/j.neuroimage.2012.01.114

Publication Analysis

Top Keywords

auditory visual: 20
crossmodulation frequencies: 16
steady-state responses: 8
interactions sensory: 8
sensory modalities: 8
modality auditory: 8
stimulus features: 8
sensory modality: 8
visual modalities: 8
sensory: 7

Similar Publications

Age-related hearing loss (ARHL) is considered one of the most common neurodegenerative disorders in the elderly; however, how it contributes to cognitive decline is poorly understood. With resting-state functional magnetic resonance imaging from 66 individuals with ARHL and 54 healthy controls, group spatial independent component analyses, sliding window analyses, graph-theory methods, multilayer networks, and correlation analyses were used to identify ARHL-induced disturbances in static and dynamic functional network connectivity (sFNC/dFNC), alterations in global network switching and their links to cognitive performances. ARHL was associated with decreased sFNC/dFNC within the default mode network (DMN) and increased sFNC/dFNC between the DMN and central executive, salience (SN), and visual networks.


Plastic changes in the brain are primarily limited to early postnatal periods. Recovery of adult brain plasticity is critical for the effective development of therapies. A brief (1-2 weeks) period of visual deprivation (dark exposure, DE) in adult mice can trigger functional plasticity of thalamocortical and intracortical circuits in the primary auditory cortex, suggesting improved sound processing.


The visual word form area (VWFA) is a region in the left ventrotemporal cortex (VTC) whose specificity remains contentious. Using precision fMRI, we examine the VWFA's responses to numerous visual and nonvisual stimuli, comparing them to adjacent category-selective visual regions and to regions involved in language and attentional demand. We find that the VWFA responds moderately to non-word visual stimuli, but is unique within the VTC in its pronounced selectivity for visual words.


Perception and production of music and speech rely on auditory-motor coupling, a mechanism which has been linked to temporally precise oscillatory coupling between auditory and motor regions of the human brain, particularly in the beta frequency band. Recently, brain imaging studies using magnetoencephalography (MEG) have also shown that accurate auditory temporal predictions specifically depend on phase coherence between auditory and motor cortical regions. However, it is not yet clear whether this tight oscillatory phase coupling is an intrinsic feature of the auditory-motor loop, or whether it is only elicited by task demands.


Salient Voice Symptoms in Primary Muscle Tension Dysphonia.

J Voice, January 2025

School of Behavioral and Brain Sciences, Department of Speech, Language, and Hearing, Callier Center for Communication Disorders, University of Texas at Dallas, Richardson, TX; Department of Otolaryngology - Head and Neck Surgery, University of Texas Southwestern Medical Center, Dallas, TX.

Introduction: Patients with primary muscle tension dysphonia (pMTD) commonly report symptoms of vocal effort, fatigue, discomfort, odynophonia, and aberrant vocal quality (e.g., vocal strain, hoarseness). However, the voice symptoms most salient to pMTD have not been identified. Furthermore, it is unclear how standard vocal fatigue and vocal tract discomfort indices that capture persistent symptoms, such as the Vocal Fatigue Index (VFI) and the Vocal Tract Discomfort Scale (VTDS), relate to acute symptoms experienced at the time of the voice evaluation.
