Information about the shape and spatial orientation of an object can be gathered during exploratory hand and arm movements, and then must be synthesized into a unified percept. During the robotically guided exploration of virtual polygons or triangles, the perception of the lengths of two adjoining segments is not always geometrically consistent with the perception of the internal angles between these segments. The present study further characterized this established inconsistency, and also found that subjects' internal angle judgments were influenced by the spatial orientations of the segments, especially the segment that was explored last in the sequence. Internal angle judgments were also biased by the subjects' own active forces, applied in the direction perpendicular to the programmed handle motion. For the last segment, but not for the earlier segments, subjects produced more outward force when they reported larger angles and more inward force when they reported smaller angles. Thus, the haptic synthesis of object shape is influenced by multiple geometric, spatial, and self-produced factors.
| Download full-text PDF | Source |
|---|---|
| http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3693567 | PMC |
| http://dx.doi.org/10.1109/TOH.2011.18 | DOI Listing |