Purpose: Surgical treatment of low rectal cancer requires careful consideration because of the tumor's low location in the rectum. Successful surgical outcomes depend heavily on the surgeon's ability to determine a clear distal margin for the rectal tumor. This is challenging in robot-assisted laparoscopic surgery, since the tumor is often concealed within the rectum and robotic surgical instruments provide no real-time tactile feedback for tissue diagnosis. This paper presents the development and evaluation of an intraoperative ultrasound-based augmented reality framework for surgical guidance in robot-assisted rectal surgery.
Methods: Framework implementation consists of calibrating the transrectal ultrasound (TRUS) probe and the endoscopic camera (hand-eye calibration), generating a virtual model and registering it to the endoscopic image via optical tracking, and displaying the augmented view on a head-mounted display. An experimental validation setup is designed to evaluate the framework.
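The abstract does not specify which hand-eye calibration algorithm is used. As a minimal illustrative sketch only, a calibration of this kind can be estimated with OpenCV's calibrateHandEye from pose pairs collected by imaging a calibration pattern with the endoscopic camera while an optical tracker reports the pose of a marker rigidly attached to it; the function and data names below are hypothetical:

```python
# Sketch: hand-eye calibration between an optically tracked marker and the
# endoscopic camera, assuming N corresponding pose pairs have been collected.
import numpy as np
import cv2

def hand_eye_calibrate(R_marker2tracker, t_marker2tracker,
                       R_pattern2cam, t_pattern2cam):
    """Estimate the fixed camera-to-marker transform from N pose pairs."""
    R_cam2marker, t_cam2marker = cv2.calibrateHandEye(
        R_marker2tracker, t_marker2tracker,  # marker poses in the tracker frame
        R_pattern2cam, t_pattern2cam,        # pattern poses in the camera frame
        method=cv2.CALIB_HAND_EYE_TSAI)      # classic AX = XB solver (Tsai)
    T = np.eye(4)                            # assemble a 4x4 rigid transform
    T[:3, :3] = R_cam2marker
    T[:3, 3] = t_cam2marker.ravel()
    return T
```

With the resulting transform, tracked TRUS data can be mapped into the camera frame for overlay; the choice of solver (Tsai, Park, Horaud, etc.) is an assumption here, not a statement of the authors' method.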
Results: The evaluation yields a mean error of 0.9 mm for the TRUS calibration, a maximum error of 0.51 mm for the hand-eye calibration of the endoscopic camera, and a maximum RMS error of 0.8 mm for the complete framework. In an experiment with a rectum phantom, the framework guided the surgeon to accurately localize the simulated tumor and the distal resection margin.
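The paper's exact error protocol is not given in the abstract. As a hedged sketch of how such an RMS registration error is commonly computed, the helper below maps fiducial points through an estimated 4x4 transform and measures their distance to independently measured ground-truth positions; all names are hypothetical:

```python
# Sketch: RMS error between transformed fiducials and their measured
# ground-truth positions (both given as N x 3 arrays, in mm).
import numpy as np

def rms_error(T, pts_src, pts_dst):
    """RMS distance after mapping pts_src through the rigid transform T."""
    pts_h = np.hstack([pts_src, np.ones((len(pts_src), 1))])  # homogeneous coords
    mapped = (T @ pts_h.T).T[:, :3]                           # apply transform
    return float(np.sqrt(np.mean(np.sum((mapped - pts_dst) ** 2, axis=1))))
```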
Conclusions: The framework was developed with our clinical partner and is based on actual clinical conditions. The experimental protocol and the high accuracy achieved demonstrate the feasibility of seamlessly integrating the framework into the surgical workflow.
DOI: http://dx.doi.org/10.1007/s11548-019-02100-2