We present experimental and computational evidence on the estimation of visual and proprioceptive directional information during forward, visually driven arm movements. We presented noisy directional proprioceptive and visual stimuli simultaneously and in isolation midway through a pointing movement. Directional proprioceptive stimuli were created by brief force pulses, which varied in direction and were applied to the fingertip shortly after movement onset. Subjects indicated the perceived direction of the stimulus after each trial. We measured unimodal performance in trials in which we presented only the visual or only the proprioceptive stimulus. When we presented simultaneous but conflicting bimodal information, subjects' perceived direction fell between the visual and proprioceptive directions. The judged mean direction matched the prediction of maximum-likelihood estimation (MLE), but judgments did not show the improvement in reliability over unimodal performance that MLE integration predicts. We present an alternative model, probabilistic cue switching (PCS), which is consistent with our data. According to this model, subjects base their bimodal judgment on only one of the two directional cues in a given trial, with relative choice probabilities proportional to the average stimulus reliability. These results suggest that subjects based their decisions on a probability mixture of both modalities without integrating information across modalities.
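The contrast between the two models can be sketched in a short simulation. Under MLE integration, each trial's estimate is a reliability-weighted average of the two cues, so the bimodal variance is smaller than either unimodal variance; under probabilistic cue switching, each trial commits to a single cue with probability proportional to its relative reliability, which reproduces the same mean but no reliability gain. All parameter values below are illustrative, not the study's actual stimulus parameters.

```python
import random
import statistics

random.seed(0)

# Hypothetical conflicting cues: visual and proprioceptive directions (deg)
mu_v, sigma_v = 10.0, 4.0    # visual cue: mean direction, noise SD
mu_p, sigma_p = -10.0, 8.0   # proprioceptive cue: conflicting direction

r_v = 1.0 / sigma_v**2       # reliability = inverse variance
r_p = 1.0 / sigma_p**2
w_v = r_v / (r_v + r_p)      # MLE weight on the visual cue (here 0.8)

n = 200_000
mle, pcs = [], []
for _ in range(n):
    x_v = random.gauss(mu_v, sigma_v)
    x_p = random.gauss(mu_p, sigma_p)
    # MLE integration: reliability-weighted average on every trial
    mle.append(w_v * x_v + (1.0 - w_v) * x_p)
    # PCS: use only one cue per trial, chosen with probability
    # proportional to its relative reliability
    pcs.append(x_v if random.random() < w_v else x_p)

print(f"MLE  mean {statistics.mean(mle):6.2f}  sd {statistics.stdev(mle):5.2f}")
print(f"PCS  mean {statistics.mean(pcs):6.2f}  sd {statistics.stdev(pcs):5.2f}")
```

Both models predict the same mean judged direction (here 6 deg, between the two cue directions), but only MLE integration reduces the trial-to-trial spread; the PCS spread is inflated by switching between conflicting cue means, mirroring the absent bimodal reliability benefit reported above.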
DOI: http://dx.doi.org/10.1167/9.5.28