We investigated the effect of unseen hand posture on cross-modal, visuo-tactile links in covert spatial attention. In Experiment 1, a spatially nonpredictive visual cue was presented to the left or right hemifield shortly before a tactile target on either hand. To examine the spatial coordinates of any cross-modal cuing, the unseen hands were either uncrossed or crossed so that the left hand lay to the right and vice versa.
Cogn Affect Behav Neurosci, December 2001
In this study, we examined whether integration of visual and auditory information about emotions requires limited attentional resources. Subjects judged whether a voice expressed happiness or fear, while trying to ignore a concurrently presented static facial expression. As an additional task, the subjects had to add two numbers together rapidly (Experiment 1), count the occurrences of a target digit in a rapid serial visual presentation (Experiment 2), or judge the pitch of a tone as high or low (Experiment 3).
Neurophysiological studies have demonstrated multisensory interaction effects in the neural structures involved in saccade generation when visual, auditory or somatosensory stimuli are presented bimodally. Visual-auditory interaction effects have been demonstrated in numerous behavioural studies of saccades, but little is known about interaction effects involving somatosensory stimuli. The present study examined visual-somatosensory interaction effects on saccade generation using a multisensory paradigm, whereby task-irrelevant distractors appeared spatially coincident with, or remote from, the designated saccade target.
We examined the electrophysiological correlates of left-sided tactile extinction in a patient with right-hemisphere damage. Computer-controlled punctate touch was presented to the left, right or both index fingers in an unpredictable sequence. The patient reported his conscious tactile percept ("left", "right" or "both").
Perception of movement in acoustic space depends on comparison of the sound waveforms reaching the two ears (binaural cues) as well as spectrotemporal analysis of the waveform at each ear (monaural cues). The relative importance of these two cues is different for perception of vertical or horizontal motion, with spectrotemporal analysis likely to be more important for perceiving vertical shifts. In humans, functional imaging studies have shown that sound movement in the horizontal plane activates brain areas distinct from the primary auditory cortex, in parietal and frontal lobes and in the planum temporale.
This study examined whether differential neural responses are evoked by emotional stimuli with and without conscious perception, in a patient with visual neglect and extinction. Stimuli were briefly shown in either right, left, or both fields during event-related fMRI. On bilateral trials, either a fearful or neutral left face appeared with a right house, and it could either be extinguished from awareness or perceived.
Visual extinction after right parietal damage involves a loss of awareness for stimuli in the contralesional field when presented concurrently with ipsilesional stimuli, although contralesional stimuli are still perceived if presented alone. However, extinguished stimuli can still receive some residual on-line processing, without awareness. Here we examined whether such residual processing of extinguished stimuli can produce implicit and/or explicit memory traces lasting many minutes.
In mirror reflections, visual stimuli in near peripersonal space (e.g., an object in the hand) can project the retinal image of far, extrapersonal stimuli "beyond" the mirror.
Two experiments document that conceptual knowledge influences 3-year-olds' extension of novel words. In Experiment 1, when objects were described as having conceptual properties typical of artifacts, children extended novel labels for these objects on the basis of shape alone. When the very same objects were described as having conceptual properties typical of animate kinds, children extended novel labels for these objects on the basis of both shape and texture.
Recent results indicate that crossmodal interactions can affect activity in cortical regions traditionally regarded as "unimodal." Previously we found that combining touch on one hand with visual stimulation in the anatomically corresponding hemifield could boost responses in contralateral visual cortex. Here we manipulated which visual hemifield corresponded to the location of the stimulated hand, by changing gaze direction such that right-hand touch could now arise in either the left or right visual field.
Event-related functional magnetic resonance imaging was used to identify brain areas involved in spatial attention and determine whether these operate unimodally or supramodally for vision and touch. On a trial-by-trial basis, a symbolic auditory cue indicated the most likely side for the subsequent target, thus directing covert attention to one side. A subsequent target appeared in vision or touch on the cued or uncued side.
Recent behavioral and event-related brain potential (ERP) studies have revealed cross-modal interactions in endogenous spatial attention between vision and audition, plus vision and touch. The present ERP study investigated whether these interactions reflect supramodal attentional control mechanisms, and whether similar cross-modal interactions also exist between audition and touch. Participants directed attention to the side indicated by a cue to detect infrequent auditory or tactile targets at the cued side.
We conducted two event-related functional magnetic resonance imaging (fMRI) experiments to investigate the neural substrates of visual object recognition in humans. We used a repetition-priming method with visual stimuli recurring at unpredictable intervals, either with the same appearance or with changes in size, viewpoint or exemplar. Lateral occipital and posterior inferior temporal cortex showed lower activity for repetitions of both real and nonsense objects; fusiform and left inferior frontal regions showed decreases for repetitions of only real objects.
Previous work has found a left visual field (LVF) advantage for various judgements on faces, including identity and emotional expression. This has been related to possible right-hemisphere specialisation for face processing, and it has been proposed that this might reflect configural processing. We sought to determine whether a similar LVF advantage may also exist for gaze perception.
Different sensory systems (e.g. proprioception and vision) have a combined influence on the perception of body orientation, but the timescale over which they can be integrated remains unknown.
We used positron emission tomography (PET) to investigate the neural correlates of selective attention in humans. We examined the effects of attending to one side of space versus another (spatial selection) and to one sensory modality versus another (intermodal selection) during bilateral, bimodal stimulation of vision and touch. Attention toward one side resulted in greater activity in several contralateral areas.
In a visual-tactile interference paradigm, subjects judged whether tactile vibrations arose on a finger or thumb (upper vs. lower locations), while ignoring distant visual distractor lights that also appeared in upper or lower locations. Incongruent visual distractors (e.
Br J Psychol, February 2001
Research on attention is concerned with selective processing of incoming sensory information. To some extent, our awareness of the world depends on what we choose to attend to, not merely on the stimulation entering our senses. British psychologists have made substantial contributions to this topic in the past century.
Possible auditory deficits in neglect were examined by comparing the performance of four right brain-damaged (RBD) patients with left visuospatial neglect, versus four RBD patients without neglect, in three auditory tasks. The first task required speeded discrimination of sound elevation, by moving a central lever up or down according to the vertical position of a peripheral target sound, regardless of its side. The other two auditory tasks were non-spatial, requiring either speeded pitch discrimination (moving the central lever up for high pitch, down for low pitch) or speeded target detection.
Neurosci Biobehav Rev, August 2001
The adaptive control of behaviour in response to relevant external objects and events often requires the selection of information delivered by different sensory systems, but from the same region in external space. This can be facilitated by crossmodal links in the attentional processing of information across sensory modalities. Results from recent event-related potential (ERP) studies are reviewed that investigated mechanisms underlying such crossmodal links in spatial attention between vision, audition and touch.
We examined the effects of chronic unilateral lesions to either the inferior parietal lobe, or to the dorsolateral prefrontal cortex including the frontal eye fields (FEFs), upon human visual perception and saccades in temporal-order-judgment (TOJ) tasks. Two visual events were presented on each trial, one in each hemifield at various stimulus onset asynchronies (SOAs). In the saccade task, patients moved their eyes to whichever stimulus attracted gaze first.
Neuropsychologia, February 2002
In the present paper, we review several functional imaging studies investigating crossmodal interactions between vision and touch relating to spatial attention. We asked how the spatial unity of a multimodal event in the external world might be represented in the brain, where signals from different modalities are initially processed in distinct brain regions. The results highlight several links between visual and tactile spatial representations.
Selective attention allows people to process some stimuli more thoroughly than others. This is partly under voluntary control, and partly determined by stimulus salience. Selective attention has been studied with psychological methods for many years, but recent cognitive neuroscience studies using brain-imaging methods (and other neurobiological measures) have transformed the topic.
Detection of an oriented visual target can be facilitated by collinear visual flankers. Such lateral interactions are thought to reflect integrative processes in low-level vision. In past studies, the flankers were task-irrelevant, and were typically assumed to be unattended.