In traditional theories of perceptual learning, sensory modalities support one another. A good example comes from research on dynamic touch, the wielding of an unseen object to perceive its properties. Wielding provides the haptic system with mechanical information related to the length of the object. Visual feedback can improve the accuracy of subsequent length judgments; visual perception supports haptic perception. Such cross-modal support is not the only route to perceptual learning. We present a dynamic touch task in which we replaced visual feedback with the instruction to strike the unseen object against an unseen surface following length judgment. This additional mechanical information improved subsequent length judgments. We propose a self-organizing perspective in which a single modality trains itself.
DOI: http://dx.doi.org/10.3758/APP.71.8.1717