The aim of "Precision Surgery" is to reduce the impact of surgery on patients' global health. In this context, in recent years the use of three-dimensional virtual models (3DVMs) of organs has allowed for intraoperative guidance, revealing hidden anatomical targets and thus limiting healthy-tissue dissection and subsequent damage during an operation. In order to provide automatic 3DVM overlay onto the surgical field, we developed and tested new software, called "ikidney", based on convolutional neural networks (CNNs).
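Since the abstract does not describe the network architecture, the snippet below is only a minimal PyTorch sketch of how a CNN of this kind could segment the kidney in an endoscopic frame before any overlay step; the class name KidneySegNet and all layer sizes are illustrative assumptions, not the actual ikidney implementation.

import torch
import torch.nn as nn

class KidneySegNet(nn.Module):
    """Minimal encoder-decoder that predicts a per-pixel kidney mask
    from an RGB endoscopic frame (all layer sizes are illustrative)."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                        # H/2 x W/2
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                        # H/4 x W/4
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
            nn.Conv2d(16, 1, 1),                    # 1-channel mask logits
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# Example: one 256x256 frame in, one map of mask logits out.
frame = torch.rand(1, 3, 256, 256)
mask_logits = KidneySegNet()(frame)                 # shape (1, 1, 256, 256)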
Objectives: The research's purpose is to develop software that automatically integrates and overlays 3D virtual models of kidneys harbouring renal masses into the Da Vinci robotic console, assisting the surgeon during the intervention.
Introduction: Precision medicine, especially in the field of minimally invasive partial nephrectomy, aims to use 3D virtual models as guidance for augmented-reality robotic procedures. However, the co-registration of the virtual images over the real operative field is currently performed manually.
Background: Addressing intraoperative bleeding remains a significant challenge in the field of robotic surgery. This research endeavors to pioneer a solution utilizing convolutional neural networks (CNNs). The objective is to establish a system capable of forecasting instances of intraoperative bleeding during robot-assisted radical prostatectomy (RARP) and promptly notifying the surgeon of bleeding risk.
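As a rough illustration of frame-level bleeding-risk prediction, the sketch below re-heads a standard torchvision ResNet-18 as a binary classifier; the function bleeding_risk, the input resolution, and the choice of ResNet-18 are assumptions for demonstration, not the architecture reported in the study.

import torch
import torch.nn as nn
from torchvision import models

# Standard backbone re-headed for a single "bleeding risk" score per frame
# (pretrained weights omitted here for brevity).
backbone = models.resnet18(weights=None)
backbone.fc = nn.Linear(backbone.fc.in_features, 1)
backbone.eval()

def bleeding_risk(frame_batch: torch.Tensor) -> torch.Tensor:
    """frame_batch: (N, 3, 224, 224) normalized RGB frames from the RARP video.
    Returns an estimated probability of imminent bleeding for each frame."""
    with torch.no_grad():
        return torch.sigmoid(backbone(frame_batch)).squeeze(1)

frames = torch.rand(4, 3, 224, 224)
print(bleeding_risk(frames))   # four scores in [0, 1]; meaningless with random weights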
Objective: To evaluate the accuracy of our new three-dimensional (3D) automatic augmented reality (AAR) system, guided by artificial intelligence, in identifying the tumour's location at the level of the preserved neurovascular bundle (NVB) at the end of the extirpative phase of nerve-sparing robot-assisted radical prostatectomy.
Methods: In this prospective study, we enrolled patients with prostate cancer (clinical stages cT1c-3, cN0, and cM0) with a positive index lesion at target biopsy, suspicious for capsular contact or extracapsular extension at preoperative multiparametric magnetic resonance imaging. Patients underwent robot-assisted radical prostatectomy at San Luigi Gonzaga Hospital (Orbassano, Turin, Italy), from December 2020 to December 2021.
More than ever, precision surgery is making its way into modern surgery for functional organ preservation. This is possible mainly due to the increasing number of technologies available, including 3D models, virtual reality, augmented reality, and artificial intelligence. Intraoperative surgical navigation represents an interesting application of these technologies, allowing the surgeon to understand the surgical anatomy in detail and to plan a patient-tailored approach.
Purpose: To evaluate the role of 3D models in the positive surgical margin (PSM) rate in patients who underwent robot-assisted radical prostatectomy (RARP), compared with a no-3D control group. Secondarily, we evaluated the postoperative functional and oncological outcomes.
Methods: Prospective study enrolling patients with localized prostate cancer (PCa) undergoing RARP with mp-MRI-based 3D model reconstruction, displayed in a cognitive or augmented-reality fashion, at our Centre from 01/2016 to 01/2020.
Introduction: The current study presents a deep learning framework to determine, in real time, the position and rotation of a target organ from an endoscopic video. These inferred data are used to overlay the 3D model of the patient's organ onto its real counterpart. The resulting augmented video stream is sent back to the surgeon as support during laparoscopic robot-assisted procedures.
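A minimal sketch of the underlying idea, regressing the organ's position and rotation from a single endoscopic frame, is shown below; OrganPoseNet, the 7-D translation-plus-quaternion output, and all layer sizes are hypothetical stand-ins rather than the network actually used by the framework.

import torch
import torch.nn as nn

class OrganPoseNet(nn.Module):
    """Regresses a 7-D pose (x, y, z translation + unit quaternion) of the
    target organ from a single endoscopic frame; sizes are illustrative."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 7)

    def forward(self, x):
        pose = self.head(self.features(x).flatten(1))
        t, q = pose[:, :3], pose[:, 3:]
        q = q / q.norm(dim=1, keepdim=True)   # normalize quaternion to unit length
        return t, q

frame = torch.rand(1, 3, 256, 256)
translation, rotation_q = OrganPoseNet()(frame)   # (1, 3) and (1, 4) tensors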
Augmented reality robot-assisted partial nephrectomy (AR-RAPN) is limited by the need for constant manual overlapping of the hyper-accuracy 3D (HA3D) virtual models onto the real anatomy. We present our preliminary experience with automatic 3D virtual model overlapping during AR-RAPN. To reach fully automated HA3D model overlapping, we pursued computer vision strategies based on the identification of landmarks to anchor the virtual model.
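To illustrate a landmark-based anchoring step of this kind, the sketch below estimates a 2D similarity transform (rotation, translation, uniform scale) between landmarks projected from the virtual model and the same landmarks detected in the endoscopic frame, using OpenCV's RANSAC estimator; the landmark coordinates and the choice of a partial affine model are illustrative assumptions, not the strategy described in the paper.

import numpy as np
import cv2

# Hypothetical landmark positions: model_pts come from the projected HA3D virtual
# model, frame_pts are the same landmarks detected in the endoscopic frame.
model_pts = np.array([[120, 80], [300, 90], [210, 240], [95, 200]], dtype=np.float32)
frame_pts = np.array([[140, 95], [315, 110], [225, 260], [110, 215]], dtype=np.float32)

# Estimate rotation + translation + uniform scale mapping model -> frame.
M, inliers = cv2.estimateAffinePartial2D(model_pts, frame_pts, method=cv2.RANSAC)

def overlay_position(model_xy):
    """Map a point of the projected virtual model into frame coordinates."""
    x, y = model_xy
    return M @ np.array([x, y, 1.0])

print(overlay_position((200.0, 150.0)))   # where this model point lands in the frame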
Int J Comput Assist Radiol Surg, September 2021
Purpose: The current study aimed to propose a Deep Learning (DL) and Augmented Reality (AR) based solution for in-vivo robot-assisted radical prostatectomy (RARP), to improve the precision of a published work from our group. We implemented a two-step automatic system to align an ad hoc 3D virtual model of a patient's organ with its 2D endoscopic image, to assist surgeons during the procedure.
Methods: This approach was carried out using a Convolutional Neural Network (CNN) based structure for semantic segmentation and subsequent processing of the obtained output, which produced the parameters needed to attach the 3D model.
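The second step, deriving overlay parameters from the segmentation output, could in principle look like the sketch below, which converts a binary mask into a position (centroid), a rotation (principal axis of the segmented pixels) and a scale proxy (segmented area); this is a simplified stand-in for the elaboration actually used, and mask_to_overlay_params is a hypothetical helper.

import numpy as np

def mask_to_overlay_params(mask):
    """Derive position, rotation and scale of the overlay from a binary
    segmentation mask (H x W) of the target organ."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                              # organ not visible in this frame
    cx, cy = xs.mean(), ys.mean()                # position: mask centroid
    pts = np.stack([xs - cx, ys - cy])
    cov = pts @ pts.T / xs.size                  # 2x2 covariance of mask pixels
    eigvals, eigvecs = np.linalg.eigh(cov)
    major = eigvecs[:, np.argmax(eigvals)]       # principal axis of the mask
    angle = np.degrees(np.arctan2(major[1], major[0]))
    scale = np.sqrt(xs.size)                     # proxy for apparent organ size
    return (cx, cy), angle, scale

mask = np.zeros((256, 256), dtype=np.uint8)
mask[100:180, 60:200] = 1                        # toy rectangular "organ"
print(mask_to_overlay_params(mask))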
Purpose: The current study aimed to systematically review the literature addressing the use of deep learning (DL) methods in intraoperative surgery applications, focusing on the data collection, the objectives of these tools and, more technically, the DL-based paradigms utilized.
Methods: A literature search of standard databases was performed: using specific keywords, we identified a total of 996 papers. Of these, we selected 52 for full analysis, focusing on articles published after January 2015.
Comput Methods Programs Biomed, July 2020
Background And Objective: We present an original approach to the development of augmented reality (AR) real-time solutions for robotic surgery navigation. The surgeon operating the robotic system through a console and a visor experiences reduced awareness of the operative scene. In order to improve the surgeon's spatial perception during robot-assisted minimally invasive procedures, we provide a robust automatic software system that positions, rotates and scales, in real time, the 3D virtual model of the patient's organ over its image captured by the endoscope.
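A minimal sketch of such a real-time positioning step is given below: a pre-rendered image of the virtual model is rotated, scaled, translated and alpha-blended over the endoscopic frame with OpenCV. The helper render_overlay and its parameters are illustrative assumptions rather than the system's actual rendering pipeline.

import numpy as np
import cv2

def render_overlay(frame, model_img, center, angle, scale, alpha=0.4):
    """Rotate, scale and translate a pre-rendered image of the 3D virtual
    model, then blend it over the endoscopic frame."""
    h, w = frame.shape[:2]
    mh, mw = model_img.shape[:2]
    # Build a single affine transform: rotate/scale about the model centre,
    # then translate the model centre to the requested frame position.
    M = cv2.getRotationMatrix2D((mw / 2, mh / 2), angle, scale)
    M[0, 2] += center[0] - mw / 2
    M[1, 2] += center[1] - mh / 2
    warped = cv2.warpAffine(model_img, M, (w, h))
    return cv2.addWeighted(frame, 1.0, warped, alpha, 0.0)

frame = np.zeros((480, 640, 3), dtype=np.uint8)            # stand-in endoscopic frame
model_img = np.full((200, 200, 3), (0, 0, 255), np.uint8)  # stand-in rendered model
augmented = render_overlay(frame, model_img, center=(320, 240), angle=15.0, scale=1.2)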
Background: Despite technical improvements introduced with robotic surgery, management of complex tumours (PADUA score ≥10) is still a matter of debate within the field of transperitoneal robot-assisted partial nephrectomy (RAPN).
Objective: To evaluate the accuracy of our three-dimensional (3D) static and elastic augmented reality (AR) systems based on hyperaccuracy models (HA3D) in identifying tumours and intrarenal structures during transperitoneal RAPN (AR-RAPN), compared with standard ultrasound (US).
Design, Setting, And Participants: A retrospective study was conducted, including 91 patients who underwent RAPN for complex renal tumours, 48 with 3D AR guidance and 43 with 2D US guidance, from July 2017 to May 2019.
Introduction: As we enter the era of "big data", an increasing amount of complex health-care data will become available. These data are often redundant, "noisy", and characterized by wide variability. In order to offer a precise and comprehensive view of a clinical scenario, artificial intelligence (AI), with machine learning (ML) algorithms and artificial neural networks (ANNs), has been adopted, with promising prospects for wide diffusion in the near future.
Background: 3D reconstructions are gaining wide diffusion in nephron-sparing surgery (NSS) planning. They have usually been studied on conventional flat 2D displays, which limit real depth comprehension and interaction. Nowadays, it is possible to visualize kidney 3D reconstructions as holograms in a "mixed reality" (MR) setting.
Context: Despite the current era of precision surgery in robotics, an unmet need still remains for optimal surgical planning and navigation for most genitourinary diseases. 3D virtual reconstruction of 2D cross-sectional imaging has been increasingly adopted to help surgeons better understand the surgical anatomy.
Objectives: To provide a short overview of the most recent evidence on current applications of 3D imaging in robotic urologic surgery.
Eur Urol, October 2019
Background: In prostate cancer (PCa) surgical procedures, in order to maximize potency recovery, a nerve-sparing (NS) procedure is preferred. However, cancer abutting or focally extending beyond the prostate capsule increases the risk of a positive surgical margin.
Objective: To evaluate the accuracy of our new three-dimensional (3D) elastic augmented-reality (AR) system in identifying capsular involvement (CI) location of PCa during the NS phase of robot-assisted radical prostatectomy (RARP).
Objectives: To assess the use of hyper-accuracy three-dimensional (HA3D™; MEDICS, Moncalieri, Turin, Italy) reconstruction based on multiparametric magnetic resonance imaging (mpMRI) and superimposed imaging during augmented-reality robot-assisted radical prostatectomy (AR-RARP).
Patients And Methods: Patients with prostate cancer (clinical stages cT1-3, cN0, cM0) undergoing RARP at our Centre from June 2017 to April 2018 were enrolled. In all cases, cancer was diagnosed with targeted biopsy at the level of the index lesion based on high-resolution (1-mm slices) mpMRI.