Publications by authors named "Irene A Kuling"

Soft pneumatic displays have been shown to provide compelling soft haptic feedback. However, they have rarely been tested in Virtual Reality (VR) applications, even though we are interested in their potential for haptic feedback in the metaverse. Therefore, we designed a fully soft Pneumatic Unit Cell (PUC) and implemented it in a VR button task in which users could use their hands directly for interaction.


Soft robots are interesting examples of hyper-redundancy in robotics. However, the nonlinear continuous dynamics of these robots and the use of hyper-elastic and visco-elastic materials make modeling them more complicated. This study presents a geometric inverse kinematics (IK) model for trajectory tracking of multi-segment extensible soft robots, where each segment of the soft actuator is geometrically approximated with a rigid-link model to reduce the complexity.
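The abstract does not spell out the model, but the general idea behind geometric IK for such segments can be sketched under a common simplifying assumption (constant curvature per segment, planar case; function names and numbers here are illustrative, not from the paper):

```python
import math

def tip(length, curvature):
    """Forward model: tip of a planar constant-curvature segment
    starting at the origin and initially pointing along +y."""
    if abs(curvature) < 1e-9:
        return (0.0, length)  # straight segment
    theta = curvature * length  # total bending angle
    return ((1.0 - math.cos(theta)) / curvature,
            math.sin(theta) / curvature)

def ik(x, y):
    """Geometric inverse: recover (arc length, curvature) that reach (x, y).
    Uses the chord geometry of the arc: theta = 2*atan2(x, y)."""
    theta = 2.0 * math.atan2(x, y)
    if abs(theta) < 1e-9:
        return y, 0.0  # target straight ahead
    curvature = 2.0 * x / (x * x + y * y)
    return theta / curvature, curvature
```

Round-tripping a configuration through `tip` and `ik` recovers the original length and curvature; a multi-segment version would chain such solutions segment by segment, which is where the rigid-link approximation in the paper comes in.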


Telerobotics aims to transfer human manipulation skills and dexterity over an arbitrary distance and at an arbitrary scale to a remote workplace. A telerobotic system that is transparent enables a natural and intuitive interaction. We postulate that embodiment (with three sub-components: sense of ownership, agency, and self-location) of the robotic system leads to optimal perceptual transparency and increases task performance.


People often look at objects that they are holding in their hands. It is therefore reasonable to expect them to be able to direct their gaze precisely with respect to their fingers. However, we know that people make reproducible idiosyncratic errors of up to a few centimetres when they try to align a visible cursor to their own finger hidden below a surface.


Grip force has been studied widely in a variety of interaction and movement tasks; however, little is known about the timing of grip force control in preparation for interaction with objects. For example, it is unknown whether and how the temporal preparation for a collision is related to (the prediction of) the impact load. To study this question, we examined the anticipative timing of the grip force in preparation for impact loads.


When lifting an object, it takes time to decide how heavy it is. How does this weight judgment develop? To answer this question, we examined when visual size information has to be present to induce a size-weight illusion. We found that a short glimpse (200 ms) of size information is sufficient to induce a size-weight illusion.


The proprioceptive sense provides somatosensory information about positions of parts of the body, information that is essential for guiding behavior and monitoring the body. Few studies have investigated the perceptual localization of individual fingers, despite their importance for tactile exploration and fine manipulation. We present two experiments assessing the performance of proprioceptive localization of multiple fingers, either alone or in combination with visual cues.


When asked to move their unseen hand to visual targets, people exhibit idiosyncratic but reliable visuo-proprioceptive matching errors. Unsurprisingly, vision and proprioception quickly align when these errors are made apparent by providing visual feedback of the position of the hand. However, retention of this learning is limited, such that the original matching errors soon reappear when visual feedback is removed.


People make systematic errors when matching the location of an unseen index finger with that of a visual target. These errors are consistent over time, but idiosyncratic and surprisingly task-specific. The errors that are made when moving the unseen index finger to a visual target are not consistent with the errors when moving a visual target to the unseen index finger.


It has been proposed that haptic spatial perception depends on one's visual abilities. We tested spatial perception in the workspace using a combination of haptic matching and line drawing tasks. There were 132 participants with varying degrees of visual ability ranging from congenitally blind to normally sighted.


Cutaneous information has been shown to influence proprioceptive position sense when subjects had to judge or match the posture of their limbs. In the present study, we tested whether cutaneous information also affects proprioceptive localization of the hand when moving it to a target. In an explorative study, we manipulated the skin stretch around the elbow by attaching elastic sports tape to one side of the arm.


People make systematic errors when matching locations of an unseen index finger with the index finger of the other hand, or with a visual target. In this study, we present two experiments that test the consistency of such matching errors across different combinations of matching methods. In the first experiment, subjects had to move their unseen index fingers to visually presented targets.


Visuo-haptic biases are observed when bringing your unseen hand to a visual target. The biases differ between participants but are consistent within each participant. We investigated the usefulness of adjusting haptic guidance to these user-specific biases in aligning haptic and visual perception.


People make systematic errors when they move their unseen dominant hand to a visual target (visuo-haptic matching) or to their other unseen hand (haptic-haptic matching). Why they make such errors is still unknown. A key question in determining the reason is to what extent individual participants' errors are stable over time.


Humans make both random and systematic errors when reproducing learned movements. Intuitive haptic guidance that assists one to make the movements reduces such errors. Our study examined whether any additional haptic information about the location of the target reduces errors in a position reproduction task, or whether the haptic guidance needs to be assistive to do so.


In an admittance-controlled haptic device, input forces are used to calculate the movement of the device. Although developers try to minimize delays, there will always be delays between the applied force and the corresponding movement in such systems, which might affect what the user of the device perceives. In this experiment we tested whether these delays in a haptic human-robot interaction influence the perception of mass.
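As a toy illustration (not the authors' apparatus; all parameters hypothetical), a discrete admittance loop with a pure input delay shows why delayed motion can read as extra mass: over a fixed observation window the same force produces less velocity, so the mass inferred from impulse over velocity comes out larger than the simulated one.

```python
def simulate(force_profile, mass, delay_steps, dt=0.001):
    """Minimal admittance controller: velocity integrates F/m each step,
    but every force sample takes effect delay_steps samples later."""
    v = 0.0
    velocities = []
    for i in range(len(force_profile)):
        f = force_profile[i - delay_steps] if i >= delay_steps else 0.0
        v += (f / mass) * dt
        velocities.append(v)
    return velocities

# Push with 1 N for 1 s on a 1 kg virtual mass, with a 0.1 s delay.
vel = simulate([1.0] * 1000, mass=1.0, delay_steps=100)
apparent_mass = (1.0 * 1.0) / vel[-1]  # impulse F*T over achieved velocity
```

With no delay the final velocity would be 1.0 m/s; with the delay only 0.9 m/s of it arrives inside the window, so the apparent mass exceeds the simulated 1 kg.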


Information from cutaneous, muscle and joint receptors is combined with efferent information to create a reliable percept of the configuration of our body (proprioception). We exposed the hand to several horizontal force fields to examine whether external forces influence this percept. In an end-point task subjects reached visually presented positions with their unseen hand.


The integration of visual and auditory inputs in the human brain works properly only if the components are perceived in close temporal proximity. In the present study, we quantified cross-modal interactions in the human brain for audiovisual stimuli with temporal asynchronies, using a paradigm from rhythm perception. In this method, participants had to align the temporal position of a target in a rhythmic sequence of four markers.
