Objective: This study aims to assess the statistical significance of training parameters in 240 dense UNets (DUNets) used for enhancing low Signal-to-Noise Ratio (SNR) and undersampled MRI across various acquisition protocols. The objective is to determine whether differences between DUNet configurations are statistically meaningful and how they affect image quality metrics.
Materials And Methods: To achieve this, we trained all DUNets using the same learning rate and number of epochs, varying 5 acquisition protocols, 24 loss function weightings, and 2 ground truths (5 × 24 × 2 = 240 configurations).
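As an illustration of how such a factorial sweep can be enumerated, the following is a minimal Python sketch. The protocol, weighting, and ground-truth labels (and the `train_dunet` call) are hypothetical placeholders, since the abstract specifies only the counts.

```python
from itertools import product

# Hypothetical labels; the abstract gives only the counts
# (5 protocols x 24 loss weightings x 2 ground truths = 240 DUNets).
protocols = [f"protocol_{i}" for i in range(1, 6)]          # 5 acquisition protocols
weightings = [f"weighting_{i}" for i in range(1, 25)]       # 24 loss function weightings
ground_truths = ["gt_A", "gt_B"]                            # 2 ground truths

configs = list(product(protocols, weightings, ground_truths))
assert len(configs) == 240  # one DUNet trained per configuration

for protocol, weighting, gt in configs:
    # Fixed learning rate and epoch count across all runs, per the abstract:
    # train_dunet(protocol, weighting, gt, lr=FIXED_LR, epochs=FIXED_EPOCHS)
    pass
```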
Background: User interfaces play a vital role in the planning and execution of an interventional procedure. The objective of this study is to investigate the effect of using different user interfaces for planning transrectal robot-assisted MR-guided prostate biopsy (MRgPBx) in an augmented reality (AR) environment.
Method: End-user studies were conducted by simulating an MRgPBx system with end- and side-firing modes.
The recent introduction of wireless head-mounted displays (HMDs) promises to enhance 3D image visualization by immersing the user in the 3D morphology. This work introduces a prototype holographic augmented reality (HAR) interface for the 3D visualization of magnetic resonance imaging (MRI) data for the purpose of planning neurosurgical procedures. The computational platform generates a HAR scene that fuses pre-operative MRI sets, segmented anatomical structures, and a tubular tool for planning an access path to the targeted pathology.
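A minimal sketch of how such a fused scene might be composed as a data structure follows; the class and field names are hypothetical, as the abstract does not describe the implementation.

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class TubularTool:
    """Virtual tubular tool for planning an access path to the target."""
    entry_point: tuple[float, float, float]
    target_point: tuple[float, float, float]
    radius_mm: float = 2.0

@dataclass
class HARScene:
    """Holographic AR scene fusing imaging data and planning geometry."""
    mri_volumes: list = field(default_factory=list)      # pre-operative MRI sets
    segmentations: list = field(default_factory=list)    # segmented anatomical structures
    tool: TubularTool | None = None                      # access-path planning tool
```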
Background: This work presents user evaluation studies assessing the effect of the information rendered by interventional planning software on the operator's ability to plan transrectal magnetic resonance (MR)-guided prostate biopsies using actuated robotic manipulators.
Methods: Intervention planning software was developed based on the clinical workflow followed for MR-guided transrectal prostate biopsies. The software was designed to interface with a generic virtual manipulator and to simulate an intervention environment using 2D and 3D scenes.
Background And Objective: Modern imaging scanners produce an ever-growing body of 3D/4D multimodal data requiring image analytics and visualization of fused images, segmentations, and information. For the latter, augmented reality (AR) with head-mounted displays (HMDs) has shown potential. This work describes a framework (FI3D) for interactive immersion with data, integration of image processing and analytics, and rendering and fusion with an AR interface.
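To make the framework's stated goals concrete, here is a hypothetical sketch of the kind of module interface such a framework might expose; FI3D's actual API is not described in the abstract, so every name below is an assumption.

```python
from abc import ABC, abstractmethod

class ImageModule(ABC):
    """Hypothetical plug-in interface: each module contributes image
    processing/analytics and a renderable result for the AR scene."""

    @abstractmethod
    def process(self, volume):
        """Run analytics (e.g., segmentation) on a 3D/4D multimodal volume."""

    @abstractmethod
    def render(self):
        """Return geometry/overlays to fuse into the HMD view."""
```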