While all minimally invasive procedures involve navigating from a small incision in the skin to the site of the intervention, it has not been previously demonstrated how this can be done autonomously. To show that autonomous navigation is possible, we investigated it in the hardest place to do it: inside the beating heart. We created a robotic catheter that can navigate through the blood-filled heart using wall-following algorithms inspired by positively thigmotactic animals. The catheter employs haptic vision, a hybrid sense using imaging for both touch-based surface identification and force sensing, to accomplish wall following inside the blood-filled heart. Through in vivo animal experiments, we demonstrate that the performance of an autonomously controlled robotic catheter rivals that of an experienced clinician. Autonomous navigation is a fundamental capability on which more sophisticated levels of autonomy can be built, e.g., to perform a procedure. Similar to the role of automation in fighter aircraft, such capabilities can free the clinician to focus on the most critical aspects of the procedure while providing precise and repeatable tool motions independent of operator experience and fatigue.
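The wall-following behavior named in the abstract can be illustrated with a toy control loop. This is a minimal sketch of the generic wall-following technique, with hypothetical function names and gains; it is not the paper's haptic-vision controller.

```python
# Illustrative one-dimensional wall-follower (hypothetical, not the
# authors' algorithm): keep a target standoff distance from the wall
# while advancing tangentially, with a proportional term closing the
# range error at each step.

def wall_follow_step(distance_to_wall, target_distance, gain=0.5):
    """Return (forward_speed, lateral_correction) for one control step."""
    error = distance_to_wall - target_distance
    lateral_correction = gain * error  # steer toward/away from the wall
    forward_speed = 1.0                # constant tangential progress
    return forward_speed, lateral_correction

# Simulated traversal: the standoff error shrinks geometrically.
dist, target = 5.0, 2.0
for _ in range(20):
    _, corr = wall_follow_step(dist, target)
    dist -= corr  # lateral motion reduces the range error
```

With a proportional gain of 0.5, each step halves the range error, so after 20 steps the catheter-analog sits essentially at the target standoff.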
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6693882
DOI: http://dx.doi.org/10.1126/scirobotics.aaw1977
Despite the richness of the human tactile capacity, remote communication practices often lack touch-based interactions. This leads to overtaxing our visual and auditory channels, a lack of connection and engagement, and inaccessibility for diverse sensory groups. In this paper, we learn from haptic intuitions of the blind and low vision (BLV) and Protactile DeafBlind (PT-DB) communities to investigate how core functions of communication can be routed through tactile channels.
Increasing interest surrounds the use of robotic and computer technologies for precise endovascular interventions. However, a limitation of current robotic procedures is the reliance on 2D fluoroscopy for surgical navigation and the lack of haptic guidance. In addressing this, we present an improved guidance framework for CathBot, our MR-compatible endovascular robot.
Humans operating in dynamic environments with limited visibility are susceptible to collisions with moving objects, occupational hazards, and/or other agents, which can result in personal injuries or fatalities. Most existing research has focused on using vibrotactile cues to address this challenge. In this work, we propose a fundamentally new approach that utilizes variable impedance on an active exoskeleton to guide humans away from hazards and towards safe areas.
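The variable-impedance idea in this abstract can be sketched in one dimension. The names, gains, and hazard model below are hypothetical illustrations of the general technique, not the authors' exoskeleton controller.

```python
# Minimal 1-D variable-impedance guidance sketch (hypothetical names and
# parameters): a virtual spring pulls the limb toward a safe position,
# and its stiffness rises as the limb approaches a hazard, so the
# guiding force grows where the risk is highest.

def guidance_force(x, x_safe, x_hazard, k_min=10.0, k_max=200.0):
    """Spring force toward x_safe; stiffness scales with hazard proximity."""
    hazard_proximity = max(0.0, 1.0 - abs(x - x_hazard))  # 1 at hazard, 0 far away
    k = k_min + (k_max - k_min) * hazard_proximity        # variable stiffness
    return k * (x_safe - x)

# The rendered force magnitude grows as the limb nears the hazard at x = 1.0.
f_far = guidance_force(x=0.0, x_safe=-0.5, x_hazard=1.0)
f_near = guidance_force(x=0.9, x_safe=-0.5, x_hazard=1.0)
```

The design choice is the spring analogy itself: rendering guidance as position-dependent stiffness rather than discrete vibrotactile alerts lets the cue scale continuously with risk.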
IEEE Trans Neural Syst Rehabil Eng, January 2025
In recent years, various haptic rendering methods have been proposed to help people obtain interactive experiences with virtual textures through vibration feedback. However, due to impaired vision, people who are blind or visually impaired (BVI) are still unable to effectively perceive and learn virtual textures through these methods. To give BVI users the opportunity to improve their object cognition by learning virtual textures, we built a virtual-texture learning system based on multimodal feedback.
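A common vibrotactile texture-rendering rule maps a periodic texture scanned at some speed to a vibration whose temporal frequency is the scan speed times the texture's spatial frequency. The sketch below illustrates that generic rule with hypothetical names; it is not the system described in this abstract.

```python
import math

# Illustrative vibrotactile texture rendering (hypothetical names): a
# grating with a given spatial frequency, scanned at a given speed,
# produces a drive signal of temporal frequency f = speed * spatial_freq.

def vibration_sample(t, scan_speed, spatial_freq, amplitude=1.0):
    """Instantaneous actuator drive signal at time t (seconds)."""
    f = scan_speed * spatial_freq  # temporal frequency in Hz
    return amplitude * math.sin(2 * math.pi * f * t)

# Faster scanning over the same texture yields a higher-frequency signal:
f_slow = 0.05 * 400  # 20 Hz at 5 cm/s over a 400 cycles/m grating
f_fast = 0.20 * 400  # 80 Hz at 20 cm/s
```

In practice such a rule lets the same stored texture model feel consistent at different exploration speeds, which is what makes active scanning with vibration feedback interpretable.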