How features of complex visual patterns are combined to drive perception and eye movements is not well understood. Here we simultaneously assessed human observers' perceptual direction estimates and ocular following responses (OFR) evoked by moving plaids made from two summed gratings with varying contrast ratios. When the gratings were of equal contrast, observers' eye movements and perceptual reports followed the motion of the plaid pattern.
The sudden onset of a visual object or event elicits an inhibition of eye movements at latencies approaching the minimum delay of visuomotor conduction in the brain. Typically, information presented via multiple sensory modalities, such as sound and vision, evokes stronger and more robust responses than unisensory information. Whether and how multisensory information affects ultra-short latency oculomotor inhibition is unknown.
Natural movements, such as catching a ball or capturing prey, typically involve multiple senses. Yet, laboratory studies on human movements commonly focus solely on vision and ignore sound. Here, we ask how visual and auditory signals are integrated to guide interceptive movements.
Objects in our visual environment often move unpredictably and can suddenly speed up or slow down. The ability to account for acceleration when interacting with moving objects can be critical for survival. Here, we investigate how human observers track an accelerating target with their eyes and predict its time of reappearance after a temporal occlusion by making an interceptive hand movement.
Visual working memory (VWM) is typically found to be severely limited in capacity, but this limitation may be ameliorated by providing familiar objects that are associated with knowledge stored in long-term memory. However, comparing meaningful and meaningless stimuli usually entails a confound, because different types of objects also tend to vary in terms of their inherent perceptual complexity. The current study therefore aimed to dissociate stimulus complexity from object meaning in VWM.
When we catch a moving object in mid-flight, our eyes and hands are directed toward the object. Yet, the functional role of eye movements in guiding interceptive hand movements is not yet well understood. This review synthesizes emergent views on the importance of eye movements during manual interception with an emphasis on laboratory studies published since 2015.
Attention shifts that precede goal-directed eye and hand movements are regarded as markers of motor target selection. Whether effectors compete for a single, shared attentional resource during simultaneous eye-hand movements, or whether attentional resources can be allocated independently towards multiple target locations, remains a matter of debate. Independent, effector-specific target selection mechanisms underlying parallel allocation of visuospatial attention to saccade and reach targets would predict an increase in overall attention capacity with the number of active effectors.
In situations requiring immediate action, humans can generate visually guided responses at remarkably short latencies. Here, to better understand the visual attributes that best evoke such rapid responses, we recorded upper limb muscle activity while participants performed visually guided reaches towards Gabor patches composed of differing spatial frequencies (SFs). We studied reaches initiated from a stable posture (experiment 1, a static condition) or during on-line reach corrections to an abruptly displaced target (experiment 2, a dynamic condition).
In our natural environment, we interact with moving objects that are surrounded by richly textured, dynamic visual contexts. Yet most laboratory studies on vision and movement show visual objects in front of uniform gray backgrounds. Context effects on eye movements have been widely studied, but it is less well known how visual contexts affect hand movements.