Microanalysis of video from a robotic surgical procedure: implications for observational learning in the robotic environment.

J Robot Surg

Department of Surgery, University of California, San Francisco, 513 Parnassus Avenue, S-321, San Francisco, CA, 94143-0470, USA.

Published: June 2019

Without haptic feedback, robotic surgeons rely on visual processing to interpret the operative field. To provide guidance for teaching in this environment, we analyzed the intracorporeal actions and behaviors of a robotic surgeon. Six hours of video were captured by the intracorporeal camera during a robot-assisted low anterior resection. After complete review, the authors reduced the video to a consecutive 35-minute segment of highly focused robotic activity; from this, a 2-minute clip was subjected to microanalysis. The clip was replayed multiple times (at 1, 2, 10, 60, and 120 s intervals), and activities such as right- and left-hand motion, tissue handling, and camera adjustments were identified and recorded using a software program. Activity patterns were categorized into two main themes: a change in operative focus occurs when adequate tension cannot be obtained, and observation of robot-assisted surgery is based on an incomplete visual framework. The surgeon manipulated tissue predominantly with blunt adjustments and rarely grasped it, likely to avoid tissue trauma. The magnified operative field required precise dissection, which was performed robotically by moving a single instrument against a static field (the second robotic arm held motionless). This meticulous technique is unlike the bimanual manipulation often used for laparoscopic dissection. Because residents have limited active participation in robotic cases and therefore rely heavily on the captured image for skill acquisition, we recommend that surgeons use focus shifts as an opportunity to describe their operative decision-making and to highlight instrument manipulations specific to operating with robotic technology.
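The abstract does not name the annotation software or coding scheme the authors used. As a purely illustrative sketch (in Python), the code below shows one way timestamped activity coding over a 2-minute clip could be structured; the activity codes, class names, and example timestamps are hypothetical, with only the replay intervals and activity categories taken from the abstract above.

# Illustrative sketch only (not the authors' software): a minimal data structure
# for timestamped coding of intracorporeal activities during video microanalysis.
# Activity codes and example values are assumptions for demonstration.
from dataclasses import dataclass, field
from typing import Dict, List

REVIEW_INTERVALS_S = [1, 2, 10, 60, 120]   # replay granularities mentioned in the abstract
ACTIVITY_CODES = {
    "RH": "right-hand instrument motion",
    "LH": "left-hand instrument motion",
    "TH": "tissue handling (blunt adjustment or grasp)",
    "CA": "camera adjustment",
    "FS": "shift in operative focus",
}

@dataclass
class Annotation:
    start_s: float          # onset within the clip, in seconds
    end_s: float            # offset within the clip, in seconds
    code: str               # one of ACTIVITY_CODES
    note: str = ""          # free-text observation, e.g. "static left arm"

@dataclass
class MicroanalysisLog:
    clip_id: str
    annotations: List[Annotation] = field(default_factory=list)

    def add(self, start_s: float, end_s: float, code: str, note: str = "") -> None:
        if code not in ACTIVITY_CODES:
            raise ValueError(f"unknown activity code: {code}")
        self.annotations.append(Annotation(start_s, end_s, code, note))

    def coverage_by_code(self) -> Dict[str, float]:
        """Total annotated seconds per activity code."""
        totals: Dict[str, float] = {}
        for a in self.annotations:
            totals[a.code] = totals.get(a.code, 0.0) + (a.end_s - a.start_s)
        return totals

# Hypothetical coding pass over a 2-minute clip
log = MicroanalysisLog(clip_id="LAR_clip_01")
log.add(0.0, 8.0, "RH", "single-instrument dissection, left arm static")
log.add(8.0, 9.5, "CA")
log.add(9.5, 20.0, "TH", "blunt adjustment, no grasp")
print(log.coverage_by_code())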

Source
http://dx.doi.org/10.1007/s11701-018-0866-6

Publication Analysis

Top Keywords

robotic (8); operative field (8); microanalysis video (4); video robotic (4); robotic surgical (4); surgical procedure (4); procedure implications (4); implications observational (4); observational learning (4); learning robotic (4)

Similar Publications

Initial experience of a novel surgical assist robot "Saroa" featuring tactile feedback and a roll-clutch system in radical prostatectomy.

Sci Rep

December 2024

Department of Urology, The Jikei University School of Medicine, Kashiwa Hospital, Kashiwashita 163-1, Kashiwa, Chiba, 277-8567, Japan.

We evaluated the safety and efficacy of the Saroa Surgical Robot System in robot-assisted laparoscopic radical prostatectomy (RARP). We enrolled 60 patients who underwent RARP using either the Saroa (n = 9) or da Vinci Xi (n = 51) system at Jikei University Kashiwa Hospital from January 2022 to March 2024. We compared preoperative characteristics, perioperative outcomes, complications, and postoperative urinary continence at three months between the two groups.

An fMRI study on the generalization of motor learning after brain actuated supernumerary robot training.

NPJ Sci Learn

December 2024

Academy of Medical Engineering and Translational Medicine (AMT), Tianjin University, Tianjin, China.

Generalization is central to motor learning. However, few studies have examined learning generalization with a BCI-actuated supernumerary robotic finger (BCI-SRF) in human-machine interaction training, and none have explored its longitudinal neuroplasticity mechanisms. Here, 20 healthy right-handed participants were recruited, randomly assigned to a BCI-SRF group or an inborn-finger group (Finger) for 4-week training, and assessed with novel SRF-finger opposition sequences and multimodal MRI.

Imagine going left versus imagine going right: whole-body motion on the lateral axis.

Sci Rep

December 2024

Creative Robotics Lab, UNSW, Sydney, 2021, Australia.

Unlike the conventional, embodied, and embrained whole-body movements in the sagittal forward and vertical axes, movements in the lateral/transversal axis cannot be unequivocally grounded, embodied, or embrained. When considering motor imagery for left and right directions, it is assumed that participants have underdeveloped representations due to a lack of familiarity with moving along the lateral axis. In the current study, a 32-channel electroencephalography (EEG) system was used to identify the oscillatory neural signature linked with lateral-axis motor imagery.

Bioinspired origami-based soft prosthetic knees.

Nat Commun

December 2024

Department of Advanced Manufacturing and Robotics, College of Engineering, Peking University, Beijing, China.

Prosthetic knees are a prevalent solution for above-knee amputation rehabilitation. However, satisfying users' ambulation requirements while meeting their comfort needs for lightweight, bionic, shock-absorbing, and user-centric design remains out of reach. Soft materials offer alternative solutions, as their properties are conducive to comfort.

Small-scale continuum robots hold promise for interventional diagnosis and treatment, yet existing models struggle to achieve small size, precise steering, and visualized functional treatment simultaneously, termed an "impossible trinity". This study introduces an optical fiber-based continuum robot that integrates imaging, high-precision motion, and multifunctional operation at the submillimeter scale. With a slim profile of 0.
