Neural network-assisted automated image registration for MRI-guided adaptive brachytherapy in cervical cancer.

Purpose: In image-guided adaptive brachytherapy (IGABT), a quantitative evaluation of the dosimetric changes between fractions due to anatomical variations can be implemented via rigid registration of images from subsequent fractions, using the applicator as a reference structure. With available treatment planning systems (TPS), this is a manual and time-consuming process. The aim of this retrospective study was to automate it: a neural network (NN) was trained to predict the applicator structure from MR images, and the resulting segmentation was used to automatically register MR volumes.
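
The abstract does not include the registration implementation itself. As a rough illustration of the underlying idea (rigidly aligning two fractions using the applicator as the shared reference), the sketch below estimates a rigid transform from corresponding applicator points, for example reconstructed dwell positions, with the Kabsch/Procrustes method. The function name and the assumption of point-to-point correspondence are illustrative, not taken from the study.

```python
import numpy as np

def estimate_rigid_transform(src: np.ndarray, dst: np.ndarray):
    """Estimate rotation R and translation t mapping src points onto dst.

    src, dst: (N, 3) arrays of corresponding applicator points (e.g.
    dwell positions) in the coordinate frames of two fractions.
    Kabsch/Procrustes solution via SVD of the cross-covariance matrix.
    """
    src_centroid = src.mean(axis=0)
    dst_centroid = dst.mean(axis=0)
    H = (src - src_centroid).T @ (dst - dst_centroid)  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    sign = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, sign]) @ U.T
    t = dst_centroid - R @ src_centroid
    return R, t
```

In practice, the estimated transform would then be applied to the moving MR volume and plan by resampling in the TPS or a registration toolkit; how the study applies its transforms is not specified in the abstract.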

Material And Methods: DICOM images and plans of 56 patients treated for cervical cancer with high dose-rate (HDR) brachytherapy were used in the study. A 2D and a 3D NN were trained to segment applicator structures on clinical T2-weighted MRI datasets. Different rigid registration algorithms were investigated and compared. To evaluate a fully automatic registration workflow, the NN-predicted applicator segmentations (AS) were used for rigid image registration with the best performing algorithm. The DICE coefficient and mean distance error between dwell positions (MDE) were used to evaluate segmentation and registration performance.
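
The two evaluation metrics are described only in prose. A minimal sketch of how they are commonly computed is shown below, assuming binary applicator masks on a common grid and paired dwell-position coordinates in millimetres; the exact implementation used in the study is not given in the abstract.

```python
import numpy as np

def dice_coefficient(pred_mask: np.ndarray, true_mask: np.ndarray) -> float:
    """DICE overlap between predicted and ground-truth applicator masks."""
    pred = pred_mask.astype(bool)
    true = true_mask.astype(bool)
    intersection = np.logical_and(pred, true).sum()
    denom = pred.sum() + true.sum()
    return 2.0 * intersection / denom if denom else 1.0

def mean_distance_error(dwell_fixed: np.ndarray, dwell_registered: np.ndarray) -> float:
    """Mean Euclidean distance (mm) between corresponding dwell positions
    after registration; both inputs are (N, 3) coordinate arrays."""
    return float(np.linalg.norm(dwell_fixed - dwell_registered, axis=1).mean())
```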

Results: The mean DICE coefficient for the predicted AS was 0.70 ± 0.07 and 0.58 ± 0.04 for the 3D NN and 2D NN, respectively. Registration algorithms achieved MDE values ranging from 8.1 ± 3.7 mm (worst) to 0.7 ± 0.5 mm (best) when using ground-truth AS. Using the predicted AS from the 3D NN together with the best registration algorithm, an MDE of 2.7 ± 1.4 mm was achieved.

Conclusion: The combination of deep learning models and state-of-the-art image registration techniques was shown to be a promising solution for automatic image registration in IGABT. Combined with auto-contouring of organs at risk, the auto-registration workflow from this study could become part of an online dosimetric interfraction evaluation workflow in the future.

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9948828
DOI: http://dx.doi.org/10.1016/j.zemedi.2022.04.002

Publication Analysis

Top Keywords

image registration (16), registration (10), adaptive brachytherapy (8), cervical cancer (8), rigid registration (8), registration algorithms (8), dice coefficient (8), neural network-assisted (4), network-assisted automated (4), image (4)
