Does cross-modal correspondence modulate modality-specific perceptual processing? A study using timing judgment tasks.

Atten Percept Psychophys

Department of Psychology, Graduate School of Humanities and Sociology, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo, 113-0033, Japan.

Published: January 2024

Cross-modal correspondences refer to associations between stimulus features across sensory modalities. Previous studies have shown that cross-modal correspondences modulate reaction times for detecting and identifying stimuli in one modality when uninformative stimuli from another modality are present. However, it is unclear whether such modulation reflects changes in modality-specific perceptual processing. We used two psychophysical timing judgment tasks to examine the effects of audiovisual correspondences on visual perceptual processing. In Experiment 1, we conducted a temporal order judgment (TOJ) task in which participants judged which of two visual stimuli, presented at various stimulus onset asynchronies (SOAs), appeared first. In Experiment 2, we conducted a simultaneity judgment (SJ) task in which participants reported whether the two visual stimuli were simultaneous or successive. In both experiments, a task-irrelevant auditory stimulus was presented either simultaneously with or preceding the first visual stimulus, and we manipulated the congruency between the auditory and visual stimuli. In Experiment 1, the points of subjective simultaneity (PSSs) between the two visual stimuli estimated from the TOJ task shifted according to the correspondence between auditory pitch and the visual features of vertical location and size. However, these audiovisual correspondences did not affect the PSSs estimated from the SJ task in Experiment 2. The discrepancy between the two tasks can be explained by a response bias triggered by audiovisual correspondence, to which only the TOJ task is susceptible. We concluded that audiovisual correspondence does not modulate visual perceptual timing and that changes in modality-specific perceptual processing might not underlie the congruency effects reported in previous studies.
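To illustrate how a PSS is typically derived from TOJ data, a cumulative Gaussian can be fitted to the proportion of "first" judgments as a function of SOA; the PSS is the SOA at which this function crosses 50%. The sketch below is a generic illustration with hypothetical data and invented variable names, not the authors' exact analysis pipeline.

```python
# Minimal sketch of PSS estimation from TOJ data (hypothetical example data;
# the paper's actual fitting procedure may differ).
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Proportion of "top stimulus first" responses at each SOA (ms);
# negative SOAs mean the top stimulus was physically first.
soas = np.array([-120, -80, -40, 0, 40, 80, 120])
p_top_first = np.array([0.95, 0.88, 0.70, 0.48, 0.30, 0.12, 0.05])

def psychometric(soa, pss, sigma):
    """Cumulative Gaussian: probability of judging the top stimulus first."""
    return norm.cdf(-(soa - pss) / sigma)

(pss, sigma), _ = curve_fit(psychometric, soas, p_top_first, p0=(0.0, 50.0))
print(f"PSS = {pss:.1f} ms, slope parameter sigma = {sigma:.1f} ms")
```

For the SJ task, the PSS is instead commonly taken as the peak of a bell-shaped function fitted to the proportion of "simultaneous" responses across SOAs.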


Source: http://dx.doi.org/10.3758/s13414-023-02812-3

Publication Analysis

Top Keywords (occurrence counts):
modality-specific perceptual (12), perceptual processing (12), toj task (12), visual stimuli (12), audiovisual correspondence (12), correspondence modulate (8), timing judgment (8), judgment tasks (8), cross-modal correspondences (8), previous studies (8)

Similar Publications

Bayesian accounts of perception, such as predictive processing, suggest that perceptions integrate expectations and sensory experience, and thus assimilate to expected values. Furthermore, more precise expectations should have stronger influences on perception. We tested these hypotheses in a paradigm that manipulates both the mean value and the precision of cues within-person.
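The precision claim follows from standard Gaussian cue combination: the posterior estimate is a precision-weighted average of the prior expectation and the sensory measurement. The notation below is generic and not taken from this abstract.

```latex
% Precision-weighted integration of a prior expectation \mu_p and a
% sensory measurement x, with precisions \tau = 1/\sigma^2:
\hat{s} = \frac{\tau_p\,\mu_p + \tau_s\,x}{\tau_p + \tau_s},
\qquad \tau_p = 1/\sigma_p^2,\quad \tau_s = 1/\sigma_s^2
% The more precise the expectation (larger \tau_p), the more the
% percept \hat{s} assimilates toward the expected value \mu_p.
```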


In psychophysiological research, the use of Virtual Reality (VR) for stimulus presentation allows for the investigation of how perceptual processing adapts to varying degrees of realism. Previous time-domain studies have shown that perceptual processing involves modality-specific neural mechanisms, as evidenced by distinct stimulus-locked components. Analyzing induced oscillations across different frequency bands can provide further insights into neural processes that are not strictly phase-locked to stimulus onset.
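As a concrete illustration of the induced-versus-evoked distinction, subtracting the trial-averaged evoked response from each trial before extracting a band-limited amplitude envelope isolates activity that is not phase-locked to stimulus onset. This is a minimal sketch with hypothetical data and invented function names, not the study's pipeline.

```python
# Minimal sketch: induced (non-phase-locked) band power from epoched EEG.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def induced_power(epochs, fs, band):
    """Remove the evoked response (trial average) from each trial,
    then average band-limited envelope power across trials."""
    erp = epochs.mean(axis=0)              # phase-locked (evoked) part
    residual = epochs - erp                # non-phase-locked part
    b, a = butter(4, np.array(band) / (fs / 2), btype="band")
    filtered = filtfilt(b, a, residual, axis=1)
    envelope = np.abs(hilbert(filtered, axis=1))
    return (envelope ** 2).mean(axis=0)    # induced power over time

# Hypothetical usage: 100 trials x 1000 samples at 500 Hz, alpha band.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((100, 1000))
alpha_power = induced_power(epochs, fs=500, band=(8, 12))
```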


Growing evidence suggests that conceptual knowledge influences emotion perception, yet the neural mechanisms underlying this effect are not fully understood. Recent studies have shown that brain representations of facial emotion categories in visual-perceptual areas are predicted by conceptual knowledge, but it remains to be seen if auditory regions are similarly affected. Moreover, it is not fully clear whether these conceptual influences operate at a modality-independent level.


Anticipating multisensory environments: Evidence for a supra-modal predictive system.

Cognition

January 2025

Department of Basic, Developmental and Educational Psychology, Faculty of Psychology, Autonomous University of Barcelona, Spain.

Article Synopsis
  • The study investigates how humans generate predictions in multisensory environments using both auditory and visual stimuli, aiming to determine if these predictions are made through modality-specific mechanisms or a general predictive system.
  • Participants engaged with pairs of predictable auditory and visual stimuli, focusing on one modality while ignoring the other, to assess how expectations influence their perceptual performance.
  • Results indicate that participants performed better on expected targets, with the effect extending to distractors when targets were also expected, suggesting a shared predictive system across sensory modalities.

Early work on selective attention used auditory-based tasks, such as dichotic listening, to shed light on capacity limitations and individual differences in these limitations. Today, there is great interest in individual differences in attentional abilities, but the field has shifted towards visual-modality tasks. Furthermore, most conflict-based tests of attention control lack reliability due to low signal-to-noise ratios and the use of difference scores.

