Efficient learning requires allocating limited attentional resources to meaningful stimuli and away from irrelevant stimuli. This prioritization may occur via covert attention, evident in the activity of the visual cortex. We used steady-state visual evoked potentials (SSVEPs) to assess whether associability-driven changes in stimulus processing were evident in visuocortical responses. Participants were trained on a learned-predictiveness protocol, whereby one stimulus on each trial accurately predicted the correct response for that trial, and the other was irrelevant. In a second phase the task was arranged so that all cues were objectively predictive. Participants' overt attention (eye gaze) was affected by each cue's reinforcement history, as was their covert attention (SSVEP responses). These biases persisted into Phase 2 when all stimuli were objectively predictive, thereby demonstrating that learned attentional processes are evident in basic sensory processing, and exert an effect on covert attention above and beyond the effects of overt gaze bias.
DOI: http://dx.doi.org/10.1016/j.biopsycho.2020.108004
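The abstract above does not specify its analysis pipeline, but the SSVEP measure it relies on is typically read out as the EEG amplitude at the stimulus flicker (tagging) frequency. The sketch below is a minimal, assumption-laden illustration of that readout: the 12 Hz tagging frequency, 500 Hz sampling rate, single occipital channel, and NumPy-only pipeline are all hypothetical and not taken from the article.

```python
import numpy as np

def ssvep_amplitude(eeg, fs, tag_freq):
    """Spectral amplitude of one EEG channel at a flicker (tagging) frequency.

    eeg      : 1-D array, a single occipital channel
    fs       : sampling rate in Hz
    tag_freq : stimulus flicker frequency in Hz (hypothetical here)
    """
    n = len(eeg)
    # Remove the DC offset, then take the single-sided amplitude spectrum.
    spectrum = np.abs(np.fft.rfft(eeg - eeg.mean())) / n * 2
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    # Read out the frequency bin closest to the tagging frequency.
    return spectrum[np.argmin(np.abs(freqs - tag_freq))]

# Hypothetical example: 2 s of 500 Hz EEG containing a 12 Hz response plus noise.
fs = 500
t = np.arange(0, 2, 1 / fs)
eeg = 0.8 * np.sin(2 * np.pi * 12 * t) + np.random.randn(t.size)
print(ssvep_amplitude(eeg, fs, tag_freq=12.0))
```

An attentional bias in this framework would appear as a larger amplitude at the tagging frequency of the attended stimulus than at that of the ignored one.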
While research on auditory attention in complex acoustic environments is a thriving field, experimental studies have so far typically treated participants as passive listeners. The present study, which combined real-time covert loudness manipulations with online probe detection, investigates, for the first time to our knowledge, the effects of acoustic salience on auditory attention during live interactions, using musical improvisation as an experimental paradigm. We found that musicians were more likely to pay attention to a given co-performer when that performer was made to sound louder or softer; that this salience effect was not driven by the local variations introduced by our manipulations but more likely by the longer-term context; and that improvisers tended to be more strongly and more stably coupled when a musician was made more salient.
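The abstract does not describe how the covert loudness manipulations were implemented. As a rough illustration only, a gain change in decibels applied to one performer's channel can be ramped in slowly so that the change itself is not noticed; the ramp duration, gain values, and function names below are hypothetical.

```python
import numpy as np

def apply_covert_gain(signal, gain_db, fs, ramp_s=2.0):
    """Scale one performer's audio channel by gain_db, ramping the gain in
    gradually so the transition is unlikely to be noticed by listeners."""
    target = 10 ** (gain_db / 20)                  # dB offset -> linear amplitude factor
    n_ramp = min(int(ramp_s * fs), signal.size)    # ramp length in samples
    envelope = np.full(signal.size, target)
    envelope[:n_ramp] = np.linspace(1.0, target, n_ramp)
    return signal * envelope
```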
J Neurosci
January 2025
Université Paris Cité, CNRS, Integrative Neuroscience and Cognition Center, F-75006 Paris, France.
Attention is key to perception and human behavior, and evidence shows that it samples sensory information periodically (<20 Hz). However, this view has recently been challenged because of methodological concerns and gaps in our understanding of the function and mechanism of rhythmic attention. Here we used an intensive ∼22-hour psychophysical protocol combined with reverse correlation analyses to infer the neural representation underlying these rhythms.
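A common way to test for rhythmic sampling in this literature, though not necessarily the analysis used in the article above, is to densely sample detection performance across cue-target delays and look for a spectral peak in the detrended behavioral time course. The delay range, step size, and 8 Hz modulation in this sketch are hypothetical.

```python
import numpy as np

def behavioral_spectrum(hit_rate, dt):
    """Amplitude spectrum of a linearly detrended hit-rate time course.

    hit_rate : 1-D array, proportion correct at each cue-target delay
    dt       : spacing between successive delays, in seconds
    """
    x = np.arange(hit_rate.size)
    detrended = hit_rate - np.polyval(np.polyfit(x, hit_rate, 1), x)
    amp = np.abs(np.fft.rfft(detrended)) / hit_rate.size * 2
    freqs = np.fft.rfftfreq(hit_rate.size, d=dt)
    return freqs, amp

# Hypothetical example: delays from 0.1 to 1.0 s in 25 ms steps with an 8 Hz modulation.
delays = np.arange(0.1, 1.0, 0.025)
hits = 0.7 + 0.05 * np.sin(2 * np.pi * 8 * delays) + 0.02 * np.random.randn(delays.size)
freqs, amp = behavioral_spectrum(hits, dt=0.025)
print(freqs[np.argmax(amp[1:]) + 1])  # frequency of the strongest non-DC peak
```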
Crit Care Med
December 2024
Department of Neurology, Northwestern University, Chicago, IL.
Objectives: To determine whether cognitive impairments of important severity escape detection by guideline-recommended delirium and encephalopathy screening instruments in critically ill patients.
Design: Cross-sectional study with random patient sampling.
Setting: ICUs of a large referral hospital with protocols implementing the Society of Critical Care Medicine's ICU Liberation Bundle.
Biol Psychol
December 2024
Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, China; Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian 116029, China.
J Vis
December 2024
School of Psychological Science, University of Bristol, Bristol, UK.
Being able to detect changes in our visual environment reliably and quickly is important for many daily tasks. The motion silencing effect describes a reduced ability to detect feature changes in faster-moving objects compared with stationary or slowly moving ones. One theory is that spatiotemporal receptive-field properties in early vision account for the silencing effect, suggesting that its origins lie in low-level visual processing.
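The low-level account can be illustrated with a toy computation, not the model tested in the article: as an object moves faster, it spends less time inside any one receptive field, so a temporally sluggish filter integrates less of the feature-change signal. The receptive-field width, time constant, and speeds below are hypothetical.

```python
import numpy as np

def rf_response_to_change(speed_deg_s, rf_width_deg=1.0, tau_s=0.05, fs=1000):
    """Peak response of a temporally low-pass receptive field to a feature change
    carried by a dot moving at speed_deg_s (degrees of visual angle per second)."""
    dwell = rf_width_deg / speed_deg_s              # time the dot spends inside the RF
    t = np.arange(0, 1.0, 1 / fs)
    drive = (t < dwell).astype(float)               # change signal present only during the dwell
    kernel = np.exp(-t / tau_s)
    kernel /= kernel.sum()                          # normalized exponential temporal filter
    return np.convolve(drive, kernel)[:t.size].max()

# Response shrinks as speed grows: a toy analogue of motion silencing.
for speed in (0.5, 2.0, 8.0, 32.0):
    print(speed, round(rf_response_to_change(speed), 3))
```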