To investigate the basis of crossmodal visual distractor congruency effects, we recorded event-related brain potentials (ERPs) while participants performed a tactile location-discrimination task. Participants made speeded location-discrimination responses to tactile targets presented to the index fingers or thumbs while ignoring simultaneously presented task-irrelevant visual distractor stimuli at either the same (congruent) or a different (incongruent) location. Behavioural results were in line with previous studies, showing slowed response times and increased error rates on incongruent compared with congruent visual distractor trials. To clarify the effect of visual distractors on tactile processing, concurrently recorded ERPs were analyzed for poststimulus, preresponse, and postresponse modulations. An enhanced negativity was found in the time range of the N2 component on incongruent compared with congruent visual distractor trials prior to correct responses. In addition, postresponse ERPs showed error-related negativity components on incorrect-response trials, with enhanced negativity for congruent-incorrect compared with incongruent-incorrect trials. This pattern of ERP results has previously been related to response conflict (Yeung, Botvinick, & Cohen, 2004). Importantly, no modulation of early somatosensory ERPs was present prior to the N2 time range, which would have suggested a contribution of other perceptual or postperceptual processes to crossmodal congruency effects. Taken together, our results suggest that crossmodal visual distractor effects are largely due to response conflict.
DOI: http://dx.doi.org/10.3758/cabn.8.1.65
Behav Sci (Basel)
January 2025
Faculty of Psychology, Tianjin Normal University, Tianjin 300387, China.
Attentional control settings (ACSs) help us efficiently select targets in complex real-world environments. Previous research has shown that category-specific ACSs demand more attentional resources than feature-specific ACSs. However, comparing natural or alphanumeric categories with color features does not distinguish the effects of processing hierarchy from those of target-defining properties.
Br J Vis Impair
September 2024
The Laboratory for Visual Neuroplasticity, Department of Ophthalmology, Massachusetts Eye and Ear, Harvard Medical School, Boston, USA.
Cerebral visual impairment (CVI) is a brain-based visual disorder associated with injury and/or maldevelopment of central visual pathways. Visuospatial processing impairments are a cardinal feature of the complex clinical profile of individuals with CVI. Here, we assessed visuospatial processing abilities using a classic conjunction search task.
eNeuro
January 2025
Paris-Lodron-University of Salzburg, Department of Psychology, Centre for Cognitive Neuroscience, Salzburg, Austria.
Observing lip movements of a speaker facilitates speech understanding, especially in challenging listening situations. Converging evidence from neuroscientific studies shows stronger neural responses to audiovisual stimuli than to audio-only stimuli. However, the interindividual variability of this contribution of lip movement information, and its consequences for behavior, are unknown.
Curr Biol
January 2025
Department of Psychology, New York University, New York, NY 10003, USA; Center for Neural Science, New York University, New York, NY 10003, USA. Electronic address:
In human adults, visual perception varies throughout the visual field. Performance decreases with eccentricity and varies around polar angle. At isoeccentric locations, performance is typically higher along the horizontal than vertical meridian (horizontal-vertical asymmetry [HVA]) and along the lower than the upper vertical meridian (vertical meridian asymmetry [VMA]).
Atten Percept Psychophys
January 2025
School of Allied Health and Communicative Disorders, Northern Illinois University, DeKalb, IL, USA.
Speechreading, gathering speech information from talkers' faces, supports speech perception when speech acoustics are degraded. Benefitting from speechreading, however, requires listeners to visually fixate talkers during face-to-face interactions. The purpose of this study is to test the hypothesis that preschool-aged children allocate their eye gaze to a talker when speech acoustics are degraded.