Automatically Characterizing Sensory-Motor Patterns Underlying Reach-to-Grasp Movements on a Physical Depth Inversion Illusion.

Front Hum Neurosci

Graduate Program in Neuroscience, Rutgers University, Piscataway, NJ, USA; Department of Biomedical Engineering, Rutgers University, Piscataway, NJ, USA; Center for Cognitive Science, Rutgers University, Piscataway, NJ, USA; Department of Psychology, Rutgers University, Piscataway, NJ, USA; Department of Computer Science, Rutgers University, Piscataway, NJ, USA.

Published: January 2016

Recently, movement variability has been of great interest to motor control physiologists as it constitutes a physical, quantifiable form of sensory feedback to aid in planning, updating, and executing complex actions. In marked contrast, the psychological and psychiatric arenas mainly rely on verbal descriptions and interpretations of behavior via observation. Consequently, a large gap exists between the body's manifestations of mental states and their descriptions, creating a disembodied approach in the psychological and neural sciences: contributions of the peripheral nervous system to central control, executive functions, and decision-making processes are poorly understood. How do we shift from a theorizing, psychological approach to a more objective characterization of complex behaviors? We introduce a novel, objective, statistical framework and visuomotor control paradigm to help characterize the stochastic signatures of minute fluctuations in overt movements during a visuomotor task. We also quantify a new class of covert movements that spontaneously occur without instruction. These are largely beneath awareness, but inevitably present in all behaviors. The inclusion of these motions in our analyses introduces a new paradigm in sensory-motor integration. As it turns out, these movements, often overlooked as motor noise, contain valuable information that contributes to the emergence of different kinesthetic percepts. We apply these new methods to help better understand perception-action loops. To investigate how perceptual inputs affect reach behavior, we use a depth inversion illusion (DII): the same physical stimulus produces two distinct depth percepts that are nearly orthogonal, enabling a robust comparison of competing percepts.
We find that the moment-by-moment empirically estimated motor output variability can inform us of the participants' perceptual states, detecting physiologically relevant signals from the peripheral nervous system that reveal internal mental states evoked by the bi-stable illusion. Our work proposes a new statistical platform to objectively separate changes in visual perception by quantifying the unfolding of movement, emphasizing the importance of including in the motion analyses all overt and covert aspects of motor behavior.
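The idea of turning moment-by-moment motor output variability into a "stochastic signature" can be illustrated with a minimal sketch. The normalization step and the Gamma maximum-likelihood fit below are assumptions for illustration only (the abstract does not specify the exact pipeline): local speed peaks are scale-normalized into (0, 1), and the fitted (shape, scale) pair serves as an empirically estimated signature that could be compared across perceptual conditions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def micromovement_spikes(speed):
    """Extract normalized peak fluctuations from a speed profile.

    Each local speed maximum is divided by the sum of the peak and its
    surrounding local average, giving values in (0, 1) that register
    minute fluctuations independent of overall movement scale.
    (Illustrative normalization, not necessarily the paper's exact one.)
    """
    peaks = [i for i in range(1, len(speed) - 1)
             if speed[i] > speed[i - 1] and speed[i] > speed[i + 1]]
    out = []
    for i in peaks:
        local_avg = speed[max(0, i - 5):i + 6].mean()
        out.append(speed[i] / (speed[i] + local_avg))
    return np.array(out)

# Synthetic speed traces standing in for two hypothetical perceptual
# conditions: same mean behavior, different moment-to-moment noise.
t = np.linspace(0, 1, 2000)
base = np.abs(np.sin(8 * np.pi * t))
cond_a = base + rng.gamma(2.0, 0.02, t.size)   # lower-noise condition
cond_b = base + rng.gamma(2.0, 0.06, t.size)   # higher-noise condition

for name, trace in [("percept A", cond_a), ("percept B", cond_b)]:
    spikes = micromovement_spikes(trace)
    # MLE Gamma fit with location pinned at 0; the (shape, scale) pair
    # is a point on the Gamma parameter plane, used here as the
    # empirically estimated stochastic signature of the condition.
    shape, loc, scale = stats.gamma.fit(spikes, floc=0)
    print(f"{name}: shape={shape:.2f}, scale={scale:.4f}, n={len(spikes)}")
```

With real kinematic data, one would separate trials by reported percept and test whether the fitted signatures occupy distinct regions of the parameter plane.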


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4700265
DOI: http://dx.doi.org/10.3389/fnhum.2015.00694

Publication Analysis

Top Keywords

depth inversion (8); inversion illusion (8); mental states (8); peripheral nervous (8); nervous system (8); automatically characterizing (4); characterizing sensory-motor (4); sensory-motor patterns (4); patterns underlying (4); underlying reach-to-grasp (4)

Similar Publications

Objective: The objective of this study was to determine the outcomes of robotic peritoneal flap vaginoplasty.

Background: There is a lack of long-term outcomes data for gender-affirming vaginoplasty to inform patient decision-making.

Methods: A retrospective cohort of 500 consecutive patients undergoing robotic peritoneal flap vaginoplasty from 2017-2023 was reviewed.


In the eastern segment of the Central Asian Orogenic Belt (CAOB), there is widespread volcanic magma activity. However, there is still considerable controversy over the formation mechanisms and material sources of these volcanoes. The mantle transition zone (MTZ), as a necessary channel for the upward and downward movement of mantle material and for energy exchange, may provide crucial constraints on the dynamic mechanisms of volcanic activity.


Guided wave tomography of pipe bends based on full waveform inversion.

Ultrasonics

January 2025

Department of Civil Engineering and Architecture, Tallinn University of Technology, Ehitajate tee 5, 19086, Tallinn, Estonia.

Pipe bends are recognized as critical areas susceptible to wall thinning, a phenomenon driven by abrupt changes in fluid flow direction and velocity. Conventional monitoring techniques for bends typically depend on localized ultrasonic thickness measurements. While these methods are effective, they can be time-consuming compared to the use of permanently installed transducers, a strategy employed in guided wave tomography (GWT).


Bathymetry critically influences the intrusion of warm Circumpolar Deep Water onto the continental shelf and under ice shelf cavities in Antarctica, thereby forcing ice melting, grounding line retreat, and sea level rise. We present a novel and comprehensive bathymetry of Antarctica that includes all ice shelf cavities and previously unmeasured continental shelf areas. The new bathymetry is based on a 3D inversion of a circumpolar compilation of gravity anomalies constrained by measurements from the International Bathymetric Chart of the Southern Ocean, BedMachine Antarctica, and discrete seafloor measurements from seismic and ocean robotic probes.


Globally, heavy metal (HM) soil pollution is becoming an increasingly serious concern. Heavy metals in soils pose significant environmental and health risks due to their persistence, toxicity, and potential for bioaccumulation. These metals often originate from anthropogenic activities such as industrial emissions, agricultural practices, and improper waste disposal.

