Front Robot AI
February 2024
Accurate texture classification empowers robots to improve their perception and comprehension of the environment, enabling informed decision-making and appropriate responses to diverse materials and surfaces. Still, texture classification faces challenges due to the vast amount of time-series data generated by robots' sensors. In addition, robots are expected to leverage human feedback during interactions with the environment, particularly in cases of misclassification or uncertainty.
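As a rough illustration of the idea described above (not the paper's method), the sketch below classifies tactile time-series windows with a simple nearest-centroid model and queries a human label only when its confidence falls below a threshold. All names, features, and the confidence threshold are assumptions made for the example.

```python
# Minimal sketch, assuming simple statistical features over 1-D tactile windows
# and a hypothetical ask_human callback; not the method from the listed article.
import numpy as np

def extract_features(window: np.ndarray) -> np.ndarray:
    """Summarize a 1-D tactile time-series window with basic statistics."""
    return np.array([window.mean(), window.std(), np.abs(np.diff(window)).mean()])

class TextureClassifier:
    def __init__(self, confidence_threshold: float = 0.7):
        self.centroids = {}                      # texture label -> feature centroid
        self.confidence_threshold = confidence_threshold

    def fit(self, windows, labels):
        feats = np.array([extract_features(w) for w in windows])
        labels = np.array(labels)
        for label in set(labels.tolist()):
            self.centroids[label] = feats[labels == label].mean(axis=0)

    def predict(self, window, ask_human=None):
        """Classify one window; defer to ask_human(window) when uncertain."""
        f = extract_features(window)
        keys = list(self.centroids.keys())
        dists = np.array([np.linalg.norm(f - self.centroids[k]) for k in keys])
        probs = np.exp(-dists) / np.exp(-dists).sum()
        best = keys[int(np.argmax(probs))]
        if probs.max() < self.confidence_threshold and ask_human is not None:
            corrected = ask_human(window)        # human feedback on uncertain samples
            old = self.centroids.get(corrected, f)
            self.centroids[corrected] = 0.9 * old + 0.1 * f
            return corrected
        return best
```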
We present the open-source design and fabrication of a compliant multimodal tactile sensing module. The sensing module design presented here enables robotic end-effectors to sense contact properties such as pressure and vibration and to estimate a quaternion representing the deformation due to contact. We designed the fabrication process for the module's compliant structure to require only 3D-printed molds and a vacuum chamber, making it accessible to a broad range of roboticists.
Reproducing human-like dexterous manipulation in robots requires identifying objects and textures. In unstructured settings, robots equipped with tactile sensors may detect textures by using touch-related characteristics. Investigating and developing methods for categorizing textures requires an extensive dataset of the physical interaction between a tactile-enabled robotic probe and the textures themselves.
Dexterous robotic manipulation tasks depend on estimating the state of in-hand objects, particularly their orientation. Although cameras have traditionally been used to estimate an object's pose, tactile sensors have recently been studied due to their robustness against occlusions. This paper explores the temporal information in tactile data for estimating the orientation of grasped objects.
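To make the temporal-estimation idea concrete, here is a hedged sketch (not the paper's model) of a recurrent network that maps a sequence of tactile frames to a unit quaternion for the grasped object's orientation. The input dimension (16 taxels), hidden size, and layer choices are assumptions for illustration only.

```python
# Illustrative sketch under assumed dimensions; not the architecture from the article.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TactileOrientationEstimator(nn.Module):
    def __init__(self, n_taxels: int = 16, hidden: int = 64):
        super().__init__()
        self.gru = nn.GRU(input_size=n_taxels, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 4)          # quaternion (w, x, y, z)

    def forward(self, tactile_seq: torch.Tensor) -> torch.Tensor:
        # tactile_seq: (batch, time, n_taxels)
        _, h = self.gru(tactile_seq)              # final hidden state summarizes the sequence
        q = self.head(h.squeeze(0))
        return F.normalize(q, dim=-1)             # project onto the unit quaternion sphere

# Example: a batch of 8 sequences, 50 time steps, 16 taxels each.
model = TactileOrientationEstimator()
quat = model(torch.randn(8, 50, 16))              # -> (8, 4) unit quaternions
```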
The myoelectric prosthesis is a promising tool for restoring the hand abilities of amputees, but the classification accuracy of surface electromyography (sEMG) is not high enough for real-time application. Researchers have proposed integrating sEMG signals with another modality that is not affected by amputation. The strong coordination between vision and hand manipulation motivates including visual information in prosthetic hand control.
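As a minimal sketch of the sEMG-plus-vision idea (not the architecture used in the listed article), the example below fuses an sEMG feature vector with a visual embedding through two small branches and a shared classifier. The feature dimensions and the number of grasp classes are placeholders.

```python
# Hedged sketch of late sEMG/vision fusion; dimensions and class count are assumptions.
import torch
import torch.nn as nn

class EMGVisionFusion(nn.Module):
    def __init__(self, emg_dim: int = 32, vision_dim: int = 128, n_grasps: int = 6):
        super().__init__()
        self.emg_branch = nn.Sequential(nn.Linear(emg_dim, 64), nn.ReLU())
        self.vision_branch = nn.Sequential(nn.Linear(vision_dim, 64), nn.ReLU())
        self.classifier = nn.Linear(128, n_grasps)

    def forward(self, emg_features, vision_features):
        fused = torch.cat([self.emg_branch(emg_features),
                           self.vision_branch(vision_features)], dim=-1)
        return self.classifier(fused)             # grasp-type logits

# Example: a batch of sEMG window features plus image embeddings.
model = EMGVisionFusion()
logits = model(torch.randn(4, 32), torch.randn(4, 128))   # -> (4, 6)
```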
Underactuated hands are useful tools for robotic in-hand manipulation tasks due to their capability to adapt seamlessly to unknown objects. Enabling robots using such hands to achieve and maintain stable grasping conditions under external disturbances, while keeping track of an in-hand object's state, requires learning the relationships between objects and tactile sensing data. The human somatosensory system combines visual and tactile sensing information in its "What and Where" subsystem to achieve high levels of manipulation skill.
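To illustrate the visual-tactile combination in the simplest possible terms (this is a toy sketch, not the method or model from the listed article), the example below keeps a belief over an in-hand object's planar pose by integrating high-rate tactile increments and blending in occasional absolute visual estimates. The state parameterization, blending weight, and class names are all assumptions.

```python
# Toy sketch: complementary blending of visual and tactile pose information.
# Planar (x, y, theta) state and a fixed blending weight are illustrative assumptions;
# angle wrap-around is ignored for brevity.
import numpy as np

class InHandStateTracker:
    def __init__(self, visual_weight: float = 0.3):
        self.pose = np.zeros(3)                   # (x, y, theta) in the hand frame
        self.visual_weight = visual_weight

    def update_from_tactile(self, delta_pose: np.ndarray):
        """Integrate a small pose change inferred from tactile sensing."""
        self.pose += delta_pose

    def update_from_vision(self, visual_pose: np.ndarray):
        """Blend in an absolute visual estimate when the object is not occluded."""
        w = self.visual_weight
        self.pose = (1.0 - w) * self.pose + w * visual_pose

# Example: frequent tactile updates with an occasional camera correction.
tracker = InHandStateTracker()
tracker.update_from_tactile(np.array([0.001, 0.0, 0.01]))
tracker.update_from_vision(np.array([0.02, -0.01, 0.15]))
```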