Previous studies on mental rotation (MR; the ability to imagine objects undergoing rotation) have mainly focused on visual input, with comparatively little information about tactile input. In this study, we examined whether the processes underlying MR of 3D stimuli are perceptually equivalent across the two input modalities (i.e., whether learning within a modality equals transfer of learning between modalities). We compared participants' performance in two consecutive task sessions, either in no-switch conditions (Visual→Visual or Tactile→Tactile) or in switch conditions (Visual→Tactile or Tactile→Visual). Across both task sessions, we observed MR response differences between visual and tactile inputs, as well as limited transfer of learning. In no-switch conditions, participants showed significant improvements on all dependent measures. In switch conditions, however, we only observed significant improvements in response speed with tactile input (RTs, intercepts, slopes: Visual→Tactile) and a near-significant improvement in response accuracy with visual input (Tactile→Visual). Model fit analyses (of the rotation angle effect on RTs) also suggested different patterns of learning with tactile and visual input. In "Session 1", RTs fit the rotation angles similarly well for both types of perceptual responses. In "Session 2", however, the trend lines in the fitting analyses changed markedly in the switch and tactile no-switch conditions. These results suggest that MR of 3D objects is not necessarily a perceptually equivalent process. Specialization (and priming) of the exploration strategies (i.e., speed-accuracy trade-offs) might, however, be the main factor at play in these results, rather than differences in MR itself.
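The model fit analyses mentioned above relate response times to rotation angle; in classic mental-rotation analyses the slope is conventionally read as the rate of mental rotation and the intercept as the duration of non-rotational stages (encoding, decision, response). The sketch below shows what such a per-condition linear fit could look like; the synthetic data, condition labels, and use of ordinary least squares are illustrative assumptions, not the study's actual analysis pipeline.

```python
# Minimal sketch of a per-condition linear fit of response time (RT)
# on rotation angle, as in classic mental-rotation analyses.
# The synthetic data and condition labels are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
angles = np.array([0, 45, 90, 135, 180], dtype=float)  # rotation angle (degrees)

# Simulated mean RTs (ms) per angle for two hypothetical conditions.
conditions = {
    "Visual->Visual": 2000 + 8.0 * angles + rng.normal(0, 60, angles.size),
    "Visual->Tactile": 2600 + 12.0 * angles + rng.normal(0, 60, angles.size),
}

for name, rts in conditions.items():
    # Ordinary least squares: RT = intercept + slope * angle
    slope, intercept = np.polyfit(angles, rts, deg=1)
    predicted = intercept + slope * angles
    ss_res = np.sum((rts - predicted) ** 2)
    ss_tot = np.sum((rts - rts.mean()) ** 2)
    r_squared = 1.0 - ss_res / ss_tot  # how well RTs track rotation angle
    print(f"{name}: slope = {slope:.2f} ms/deg, "
          f"intercept = {intercept:.1f} ms, R^2 = {r_squared:.3f}")
```

Under this conventional reading, a shallower slope after practice would suggest faster mental rotation per degree, whereas a lower intercept would point to faster encoding or response processes.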


Source
http://dx.doi.org/10.1007/s00221-018-5172-z


Similar Publications

Previous studies have shown that high-gamma (HG) activity in the primary visual cortex (V1) has distinct higher (broadband) and lower (narrowband) components with different functions and origins. However, it is unclear whether a similar segregation exists in the primary somatosensory cortex (S1), and the origins and roles of HG activity in S1 remain unknown. Here, we investigate the functional roles and origins of HG activity in S1 during tactile stimulation in humans and a rat model.


Tactile gnosis derives from the interplay between the hand's tactile input and the memory systems of the brain. It is the prerequisite for complex hand functions. Impaired sensation leads to profound disability.

View Article and Find Full Text PDF

Introduction: To interact with the environment, it is crucial to distinguish between sensory information that is externally generated and inputs that are self-generated. The sensory consequences of one's own movements tend to induce attenuated behavioral and neural responses compared to externally generated inputs. We propose a computational model of sensory attenuation (SA) based on Bayesian Causal Inference, where SA occurs when an internal cause for sensory information is inferred.
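As a rough illustration of the general idea (not the published model), the sketch below infers whether a signal has an internal (self-generated) or external cause from a prior and two Gaussian likelihoods, and attenuates the percept in proportion to the posterior probability of an internal cause; all parameters and the attenuation rule are assumptions for illustration.

```python
# Illustrative sketch (not the published model): infer whether a sensory
# signal has an internal (self-generated) or external cause, then attenuate
# the percept by the posterior belief in an internal cause.
# All priors, likelihood widths, and the attenuation rule are assumptions.
import math

def gaussian(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def attenuate(obs, prior_internal=0.7,
              mu_int=1.0, sigma_int=0.5,   # narrow expectation for self-generated input
              mu_ext=1.0, sigma_ext=2.0):  # broad expectation for external input
    like_int = gaussian(obs, mu_int, sigma_int)
    like_ext = gaussian(obs, mu_ext, sigma_ext)
    # Posterior probability that the observation was self-generated
    post_int = (like_int * prior_internal) / (
        like_int * prior_internal + like_ext * (1.0 - prior_internal))
    # Simple attenuation rule: scale perceived intensity by the belief
    # that the signal is self-generated (illustrative choice).
    percept = obs * (1.0 - 0.5 * post_int)
    return post_int, percept

for obs in (1.0, 3.0):
    p_int, percept = attenuate(obs)
    print(f"observation={obs:.1f}  P(internal)={p_int:.2f}  perceived={percept:.2f}")
```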


An origami-based tactile sensory ring utilizing multilayered conductive paper substrates presents an innovative approach to wearable health applications. By harnessing paper's flexibility and employing origami folding, the sensors integrate structural stability and self-packaging without added encapsulation layers. Knot-shaped designs create loop-based systems that secure conductive paper strips and protect sensing layers.


Multi-gate neuron-like transistors based on ensembles of aligned nanowires on flexible substrates.

Nano Converg

January 2025

Bendable Electronics and Sustainable Technologies (BEST) Group, Electrical and Computer Engineering Department, Northeastern University, Boston, MA, 02115, USA.

The intriguing way the receptors in biological skin encode tactile data has inspired the development of electronic skins (e-skin) with brain-inspired or neuromorphic computing. Starting with local (near-sensor) data processing, there is an inherent mechanism in play that helps to scale down the data. This is particularly attractive when one considers the huge volume of data produced by the large number of sensors expected in a large-area e-skin, such as the whole-body skin of a robot.

