Any series of sensorimotor actions shows fluctuations in speed and accuracy from repetition to repetition, even when the sensory input and motor output requirements remain identical over time. Such fluctuations are particularly prominent in reaction time (RT) series from laboratory neurocognitive tasks. Despite their ubiquity, trial-to-trial fluctuations remain poorly understood.
Decoding human speech requires the brain to segment the incoming acoustic signal into meaningful linguistic units, ranging from syllables and words to phrases. Integrating these linguistic constituents into a coherent percept lays the foundation of compositional meaning and hence understanding. Prosodic cues, such as pauses, are one important aid to segmentation in natural speech, but their interplay with higher-level linguistic processing is still unknown.
The brain engages the processes of multisensory integration and recalibration to deal with discrepant multisensory signals. These processes consider the reliability of each sensory input, with the more reliable modality receiving the stronger weight. Sensory reliability is typically assessed via the variability of participants' judgments, yet these can be shaped by factors both external and internal to the nervous system.
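For reference, the reliability weighting invoked here is commonly formalized as maximum-likelihood cue combination; in generic notation (the symbols are illustrative, not taken from this particular study), the combined estimate is

\[ \hat{s} = w_A s_A + w_V s_V, \qquad w_i = \frac{1/\sigma_i^2}{1/\sigma_A^2 + 1/\sigma_V^2}, \]

where s_A and s_V are the unisensory estimates and \sigma_A^2, \sigma_V^2 their variances, so that the less variable (more reliable) modality receives the larger weight.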
Sustainability is playing an increasingly important role in analysts' assessments of companies. Companies can respond either by pursuing a sustainability strategy that is independent of the corporate strategy (standalone sustainability strategy) or by integrating sustainability into the corporate strategy (integrated strategy). We therefore investigate the effects of different stages of sustainability integration into the corporate strategy on analysts' perceptions and buy recommendations.
Studies on multisensory perception often focus on simplified conditions in which a single stimulus is presented per modality. Yet, in everyday life, we usually encounter multiple signals per modality. To understand how multiple signals within and across the senses are combined, we extended the classical audio-visual spatial ventriloquism paradigm to combine two visual stimuli with one sound.
Hearing is an active process, and recent studies show that even the ear is affected by cognitive states or motor actions. One example is the movement of the eardrum induced by saccadic eye movements, known as "eye movement-related eardrum oscillations" (EMREOs). While these are systematically shaped by the direction and size of saccades, the consequences of saccadic eye movements and their resulting EMREOs for hearing remain unclear.
Multisensory integration and recalibration are two processes by which perception deals with discrepant signals. Both are often studied in the spatial ventriloquism paradigm. There, integration is probed by the presentation of discrepant audio-visual stimuli, while recalibration manifests as an aftereffect in subsequent judgements of unisensory sounds.
Perceptual coherence in the face of discrepant multisensory signals is achieved via the processes of multisensory integration, recalibration and sometimes motor adaptation. These supposedly operate on different time scales, with integration reducing immediate sensory discrepancies, and recalibration and motor adaptation reflecting the cumulative influence of their recent history. Importantly, whether discrepant signals are bound during perception is guided by the brain's inference of whether they originate from a common cause.
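The common-cause inference mentioned above is usually cast as Bayesian causal inference; a generic form of that model (notation illustrative, not this study's) weighs the posterior probability of a common cause C = 1 given the auditory and visual measurements x_A and x_V:

\[ P(C{=}1 \mid x_A, x_V) = \frac{P(x_A, x_V \mid C{=}1)\, p_c}{P(x_A, x_V \mid C{=}1)\, p_c + P(x_A, x_V \mid C{=}2)\,(1 - p_c)}, \]

where p_c is the prior probability of a common cause; signals are bound during perception to the extent that this posterior is high.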
Crossmodal correspondences describe our tendency to associate sensory features from different modalities with each other, such as the pitch of a sound with the size of a visual object. While such crossmodal correspondences (or associations) are described in many behavioural studies, their neurophysiological correlates remain unclear. Under the current working model of multisensory perception, both a low- and a high-level account seem plausible.
Information about the position of our hand is provided by multisensory signals that are often not perfectly aligned. Discrepancies between the seen and felt hand position or its movement trajectory engage the processes of (i) multisensory integration, (ii) sensory recalibration, and (iii) motor adaptation, which adjust perception and behavioral responses to apparently discrepant signals. To foster our understanding of the coemergence of these three processes, we probed their short-term dependence on multisensory discrepancies in a visuomotor task that has previously served as a model for multisensory perception and motor control.
Previous studies have reported correlates of bodily self-illusions such as the rubber hand illusion in signatures of rhythmic brain activity. However, individual studies focused on specific variations of the rubber hand paradigm, used different experimental setups to induce the illusion, or used different control conditions to isolate the neurophysiological signatures related to the illusory state, leaving the specificity of the reported illusion-signatures unclear. We here quantified correlates of the rubber hand illusion in EEG-derived oscillatory brain activity and asked two questions: which of the observed correlates are robust to the precise nature of the control conditions used as contrast for the illusory state, and whether such correlates emerge directly around the subjective onset of the illusion.
Speech is an intrinsically multisensory signal, and seeing the speaker's lips forms a cornerstone of communication in acoustically impoverished environments. Still, it remains unclear how the brain exploits visual speech for comprehension. Previous work debated whether lip signals are mainly processed along the auditory pathways or whether the visual system directly implements speech-related processes.
Introduction: Due to the expanded indication criteria for cochlear implants and the demographic shift towards an aging society, more and more people are receiving cochlear implants. Implantation requires close-knit audiological and speech-therapy (logopedic) aftercare. Hearing-therapy rehabilitation is currently personnel-intensive and time-consuming.
Whether two sensory cues interact during perceptual judgements depends not only on their immediate properties, but also on the overall context in which they are encountered. While in many experiments this context is fixed, in real life multisensory perception must adapt to the momentary environment. To understand the adaptive nature of human multisensory perception, we investigated spatial judgements in a ventriloquism paradigm: on different days we exposed observers to audio-visual stimuli whose discrepancy varied either over a wider (±46°) or a narrower (±26°) range, and hypothesized that exposure to the wider range would foster the multisensory binding of these signals.
Behavioural and electrophysiological studies point to an apparent influence of the state of respiration, i.e., whether we inhale or exhale, on brain activity and cognitive performance.
The neurophysiological processes reflecting body illusions such as the rubber hand remain debated. Previous studies investigating the neural responses evoked by the illusion-inducing stimulation have provided diverging reports as to when these responses reflect the illusory state of the artificial limb becoming embodied. One reason for these diverging reports may be that different studies contrasted different experimental conditions to isolate potential correlates of the illusion, but individual contrasts may reflect multiple facets of the adopted experimental paradigm and not just the illusory state.
To organize the plethora of sensory signals from our environment into a coherent percept, our brain relies on the processes of multisensory integration and sensory recalibration. We here asked how visuo-proprioceptive integration and recalibration are shaped by the presence of more than one visual stimulus, hence paving the way to study multisensory perception under more naturalistic settings with multiple signals per sensory modality. We used a cursor-control task in which proprioceptive information on the endpoint of a reaching movement was complemented by two visual stimuli providing additional information on the movement endpoint.
In postdiction, the last stimulus of a sequence changes the perception of the preceding stimuli. Postdiction has been reported in all sensory modalities, but its neural underpinnings remain poorly understood. In the rabbit illusion, a sequence of nonequidistant stimuli presented isochronously is perceived as equidistantly spaced.
The representation of speech in the brain is often examined by measuring the alignment of rhythmic brain activity to the speech envelope. To conveniently quantify this alignment (termed 'speech tracking'), many studies consider the broadband speech envelope, which combines acoustic fluctuations across the entire spectral range. Using EEG recordings, we show that relying on this broadband envelope can provide a distorted picture of speech encoding.
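To make the quantity at stake concrete, the following is a minimal sketch of how a broadband versus a band-limited speech envelope might be computed; the function and its parameters are our own illustration, not the study's actual pipeline:

    import numpy as np
    from scipy.signal import butter, sosfiltfilt, hilbert

    def speech_envelope(audio, fs, band=None, fs_out=100):
        # band=None: broadband envelope pooling the whole spectral range;
        # band=(lo, hi): envelope of a single spectral sub-band, in Hz.
        x = np.asarray(audio, dtype=float)
        if band is not None:
            sos = butter(4, band, btype='bandpass', fs=fs, output='sos')
            x = sosfiltfilt(sos, x)
        env = np.abs(hilbert(x))            # magnitude of the analytic signal
        lp = butter(4, fs_out / 2, btype='lowpass', fs=fs, output='sos')
        env = sosfiltfilt(lp, env)          # anti-alias before downsampling
        return env[::int(fs // fs_out)]     # envelope at roughly the EEG rate

Speech-tracking analyses then relate such an envelope to the EEG, for example via cross-correlation or temporal response functions; the point made above is that pooling all sub-bands into a single broadband envelope can blur spectrally distinct aspects of the encoding.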
Many studies speak in favor of a rhythmic mode of listening, by which the encoding of acoustic information is structured by rhythmic neural processes at a time scale of about 1 to 4 Hz. Indeed, psychophysical data suggest that humans do not sample acoustic information in extended soundscapes uniformly, but weight the evidence from different moments for their perceptual decision at a time scale of about 2 Hz. We here test the critical prediction that such rhythmic perceptual sampling is directly related to the state of ongoing brain activity prior to the stimulus.
The manner in which humans exploit multisensory information for subsequent decisions changes with age. Multiple causes for such age effects have been discussed, including a reduced precision of peripheral sensory representations, changes in cognitive inference about causal relations between sensory cues, and a decline in memory contributing to altered sequential patterns of multisensory behaviour. To dissociate these putative contributions, we investigated how healthy young and older adults integrate audio-visual spatial information within trials (the ventriloquism effect) and between trials (the ventriloquism aftereffect).
Our senses often receive conflicting multisensory information, which our brain reconciles by adaptive recalibration. A classic example is the ventriloquism aftereffect, which emerges following both cumulative (long-term) and trial-wise exposure to spatially discrepant multisensory stimuli. Despite the importance of such adaptive mechanisms for interacting with environments that change over multiple timescales, it remains debated whether the ventriloquism aftereffects observed following trial-wise and cumulative exposure arise from the same neurophysiological substrate.
Our brain adapts to discrepancies in the sensory inputs. One example is provided by the ventriloquism effect, experienced when the sight and sound of an object are displaced. Here the discrepant multisensory stimuli not only result in a biased localization of the sound, but also recalibrate the perception of subsequent unisensory acoustic information in the so-called ventriloquism aftereffect.
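To make the two biases concrete, here is a minimal toy model; the parameters beta (within-trial visual bias, the ventriloquism effect) and alpha (trial-by-trial recalibration producing the aftereffect) are illustrative, not fitted values from this work:

    def simulate_reports(trials, beta=0.5, alpha=0.1):
        # trials: list of (sound_pos, visual_pos) pairs in degrees azimuth;
        # visual_pos is None on unisensory (aftereffect) trials.
        shift = 0.0              # cumulative recalibration of the sound map
        reports = []
        for s_aud, s_vis in trials:
            if s_vis is None:
                reports.append(s_aud + shift)                       # aftereffect
            else:
                reports.append(s_aud + shift + beta * (s_vis - s_aud))  # effect
                shift += alpha * (s_vis - s_aud)  # discrepancy recalibrates map
        return reports

In this toy model, a visual stimulus displaced, say, 10° to the right of the sound biases the audio-visual localization report rightward immediately (the ventriloquism effect) and shifts subsequent sound-only reports in the same direction (the aftereffect).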
Despite recent progress in understanding multisensory decision-making, a conclusive mechanistic account of how the brain translates the relevant evidence into a decision is lacking. Specifically, it remains unclear whether perceptual improvements during rapid multisensory decisions are best explained by sensory- or by decision-level mechanisms.
Visual speech carried by lip movements is an integral part of communication. Yet, it remains unclear to what extent visual and acoustic speech comprehension are mediated by the same brain regions. Using multivariate classification of full-brain MEG data, we first probed where the brain represents acoustically and visually conveyed word identities.