The relative accessibility and simplicity of vestibular sensing and vestibular-driven control of head and eye movements have made the vestibular system an attractive subject for experimenters and theoreticians interested in developing realistic quantitative models of how brains gather and interpret sense data and use them to guide behavior. Head stabilization and eye counter-rotation driven by vestibular sensory input in response to rotational perturbations are natural, ecologically important behaviors that can be reproduced in the laboratory and analyzed with relatively simple mathematical models. Models drawn from dynamical systems and control theory have previously been used to analyze the behavior of vestibular sensory neurons.
Recent years have seen an explosion of interest in naturalistic behaviour and in machine learning tools for automatically tracking it. However, questions about what to measure, how to measure it, and how to relate naturalistic behaviour to neural activity and cognitive processes remain unresolved. In this Perspective, we propose a general experimental and computational framework - kinematic coding - for measuring how information about cognitive states is encoded in structured patterns of behaviour and how this information is read out by others during social interactions.
Observing and voluntarily imitating the biological kinematics displayed by a model underpins the acquisition of new motor skills via sensorimotor processes linking perception with action. Differences in voluntary imitation in autism could be related to sensorimotor processing activity during action-observation of biological motion, as well as how sensorimotor integration processing occurs across imitation attempts. Using an observational practice protocol, which minimized the active contribution of the peripheral sensorimotor system, we examined the contribution of sensorimotor processing during action-observation.
The ability to anticipate what others will do next is crucial for navigating social, interactive environments. Here, we develop an experimental and analytical framework to measure the implicit readout of prospective intention information from movement kinematics. Using a primed action categorization task, we first demonstrate implicit access to intention information by establishing a novel form of priming, which we term kinematic priming: subtle differences in movement kinematics prime action prediction.
Why do we run toward people we love, but only walk toward others? One reason is to let them know we love them. In this commentary, we elaborate on how subjective utility information encoded in vigor is read out by others. We consider the potential implications for understanding and modeling the link between movements and decisions in social environments.
Although it is well established that fronto-parietal regions are active during action observation, whether they play a causal role in the ability to infer others' intentions from visual kinematics remains undetermined. In the experiments reported here, we combined offline continuous theta burst stimulation (cTBS) with computational modeling to reveal and causally probe single-trial computations in the inferior parietal lobule (IPL) and inferior frontal gyrus (IFG). Participants received cTBS over the left anterior IPL and the left IFG pars orbitalis in separate sessions before completing an intention discrimination task (discriminate intention of observed reach-to-grasp acts) or a kinematic discrimination task unrelated to intention (discriminate peak wrist height of the same acts).
Proc Natl Acad Sci U S A
December 2009
It is generally accepted that young worker bees (Apis mellifera L.) are highly attracted to queen mandibular pheromone (QMP). Our results challenge this widely held view.