Purpose: Three-dimensional (3D) reconstructions of human anatomy have been available for surgical planning and diagnostic purposes for several years. The imaging modalities involved usually rely on several consecutive two-dimensional (2D) acquisitions to reconstruct the 3D volume, making such acquisitions expensive, time-consuming, and often a source of undesirable radiation exposure for the patient. For these reasons, several recent studies have proposed extrapolating 3D anatomical features from purely 2D exams such as X-rays, for example for implant templating in total knee or hip arthroplasty.

Method: The present study adapts a deep learning-based convolutional neural network to reconstruct 3D volumes from a single 2D digitally reconstructed radiograph, using one of the most extensive lower-limb computed tomography datasets available. This approach is based on an encoder-decoder architecture with skip connections and a multidimensional Gaussian filter as a data augmentation technique.
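
The sketch below illustrates the general idea of an encoder-decoder network with skip connections that maps a single 2D radiograph to a stack of slices, together with a multidimensional Gaussian filter used as augmentation. It is a minimal illustration written in PyTorch under assumed layer widths and an assumed output depth of 128 slices; it is not the authors' exact architecture.

```python
# Minimal sketch (not the authors' exact network): a 2D encoder with skip
# connections feeding a decoder whose final channels are read as slices of a
# 3D volume. Layer widths and the 128-slice output depth are assumptions.
import torch
import torch.nn as nn
from scipy.ndimage import gaussian_filter  # multidimensional Gaussian smoothing


def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )


class DRRTo3D(nn.Module):
    def __init__(self, depth=128):
        super().__init__()
        self.enc1 = conv_block(1, 32)
        self.enc2 = conv_block(32, 64)
        self.enc3 = conv_block(64, 128)
        self.pool = nn.MaxPool2d(2)
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
        self.dec2 = conv_block(128 + 64, 64)
        self.dec1 = conv_block(64 + 32, 32)
        # 1x1 convolution maps features to `depth` channels, interpreted as slices.
        self.head = nn.Conv2d(32, depth, 1)

    def forward(self, x):                                   # x: (B, 1, H, W) radiograph
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        e3 = self.enc3(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up(e3), e2], dim=1))  # skip connection
        d1 = self.dec1(torch.cat([self.up(d2), e1], dim=1))  # skip connection
        return self.head(d1).unsqueeze(1)                    # (B, 1, depth, H, W) volume


def augment(volume, sigma=1.0):
    """Multidimensional Gaussian filtering as a simple augmentation step."""
    return gaussian_filter(volume, sigma=sigma)
```

For instance, `DRRTo3D()(torch.zeros(1, 1, 128, 128))` yields a tensor of shape (1, 1, 128, 128, 128), i.e., one reconstructed volume per input radiograph.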

Results: The results were promising when compared against the ground-truth volumes, yielding an average structural similarity index (SSIM) of 0.77 ± 0.05.
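
As an illustration of how such a score can be computed, the snippet below compares a reconstructed volume against its ground-truth CT crop using scikit-image's SSIM implementation; the file names and the data range are assumptions for the example, not part of the study.

```python
# Illustrative SSIM evaluation between a predicted volume and the ground truth.
import numpy as np
from skimage.metrics import structural_similarity as ssim

pred = np.load("reconstructed_volume.npy")    # hypothetical predicted 3D volume
truth = np.load("ground_truth_volume.npy")    # hypothetical ground-truth CT crop

score = ssim(truth, pred, data_range=truth.max() - truth.min())
print(f"SSIM: {score:.2f}")
```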

Conclusions: This study presented a novel deep learning-based approach to reconstruct 3D medical image volumes from a single X-ray image. The network architecture was validated against the original scans, yielding SSIM values of 0.77 ± 0.05 and 0.78 ± 0.06 for the knee and hip crops, respectively.

Source: http://dx.doi.org/10.1002/mp.14835
