Sounds enhance our ability to detect, localize, and respond to co-occurring visual targets. Research suggests that sounds improve visual processing by resetting the phase of ongoing oscillations in visual cortex. However, it remains unclear what information is relayed from the auditory system to visual areas and whether sounds modulate visual activity even in the absence of visual stimuli.
Multisensory stimuli speed behavioral responses, but the mechanisms subserving these effects remain disputed. Historically, the observation that multisensory reaction times (RTs) outpace models assuming independent sensory channels has been taken as evidence for multisensory integration (the "redundant target effect"; RTE). However, this interpretation has been challenged by alternative explanations based on stimulus sequence effects, RT variability, and/or negative correlations in unisensory processing.
Speech perception is a central component of social communication. Although principally an auditory process, accurate speech perception in everyday settings is supported by meaningful information extracted from visual cues. Visual speech modulates activity in cortical areas subserving auditory speech perception, including the superior temporal gyrus (STG).
Aging is associated with widespread alterations in cerebral white matter (WM). Most prior studies of age differences in WM have used diffusion tensor imaging (DTI), but typical DTI metrics (e.g., …)
Proc Natl Acad Sci U S A
July 2020
Visual speech facilitates auditory speech perception, but the visual cues responsible for these benefits and the information they provide remain unclear. Low-level models emphasize basic temporal cues provided by mouth movements, but these impoverished signals may not fully account for the richness of auditory information provided by visual speech. High-level models posit interactions among abstract categorical (i.e., …)
Objective: Postmortem analysis of the brain from a blind human subject who had a cortical visual prosthesis implanted for 36 years (Dobelle 2000 ASAIO J. 46 3–9).
Approach: This provided insight into the design requirements for a successful human cortical visual prosthesis by revealing: (a) unexpected rotation of the electrode array 25 to 40 degrees away from the midsagittal plane, thought to be due to torque from the connecting cable; (b) degradation of the platinum electrodes; and (c) only partial coverage of the primary visual cortex by the rectangular array. The electrode array overlapped with only the anterior 45% of primary visual cortex (identified by the line of Gennari), largely missing the posterior foveal representation of visual cortex.
Antisocial behavior (AB), including violence, criminality, and substance abuse, is often linked to deficits in emotion processing, reward-related learning, and inhibitory control, as well as their associated neural networks. To better understand these deficits, the structural connections between brain regions implicated in AB can be examined using diffusion tensor imaging (DTI), which assesses white matter microstructure. Prior studies have identified differences in white matter microstructure of the uncinate fasciculus (UF), primarily within offender samples.
Co-occurring sounds can facilitate perception of spatially and temporally correspondent visual events. Separate lines of research have identified two putatively distinct neural mechanisms underlying two types of crossmodal facilitation: whereas crossmodal phase resetting is thought to underlie enhancements based on temporal correspondences, lateralized occipital evoked potentials (ERPs) are thought to reflect enhancements based on spatial correspondences. Here, we sought to clarify the relationship between these two effects to assess whether they reflect two distinct mechanisms or, rather, two facets of the same underlying process.
Cognition in action requires strategic allocation of attention between internal processes and the sensory environment. We hypothesized that this resource allocation could be facilitated by mechanisms that predict the sensory results of self-generated actions. Sensory signals conforming to predictions would be safely ignored to facilitate focus on internally generated content, whereas those violating predictions would draw attention for additional scrutiny.
Multisensory integration can play a critical role in producing unified and reliable perceptual experience. When sensory information in one modality is degraded or ambiguous, information from other senses can crossmodally resolve perceptual ambiguities. Prior research suggests that auditory information can disambiguate the contents of visual awareness by facilitating perception of intermodally consistent stimuli.
Plasticity is essential in body perception so that physical changes in the body can be accommodated and assimilated. Multisensory integration of visual, auditory, tactile, and proprioceptive signals contributes both to conscious perception of the body's current state and to associated learning. However, much is unknown about how novel information is assimilated into body perception networks in the brain.