Objective: To develop and validate an artificial intelligence (AI)-driven tool for automated segmentation of the pulp cavity system of mandibular molars on cone-beam computed tomography (CBCT) images.
Materials and Methods: After ethical approval, 66 CBCT scans were retrieved from a hospital database and divided into training (n = 26, 86 molars), validation (n = 7, 20 molars), and testing (n = 33, 60 molars) sets. After automated segmentation, an expert evaluated the quality of the AI-driven segmentations and then refined any under- or over-segmentation to produce refined-AI (R-AI) segmentations. The AI-driven and R-AI 3D models were compared to assess accuracy. Thirty percent of the testing sample was randomly selected to compute accuracy metrics and conduct a time analysis.
Results: The AI-driven tool achieved high accuracy, with a Dice similarity coefficient (DSC) of 88% ± 7% for first molars and 90% ± 6% for second molars (p > .05). The 95% Hausdorff distance (HD) was lower for AI-driven segmentation (0.13 ± 0.07) than for manual segmentation (0.21 ± 0.08) (p < .05). Regarding time efficiency, AI-driven (4.3 ± 2 s) and R-AI (139 ± 93 s) segmentation were both markedly faster than manual segmentation (2349 ± 444 s) (p < .05).
Conclusion: The AI-driven segmentation proved to be accurate and time-efficient in segmenting the pulp cavity system in mandibular molars.
Clinical Relevance: Automated segmentation of the pulp cavity system may provide a fast and accurate 3D model, facilitating minimally invasive endodontics, increasing the efficiency of the endodontic workflow, and enabling anticipation of complications.
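The accuracy metrics reported above (DSC and 95% HD) are standard overlap and surface-distance measures computed from paired binary masks. The sketch below is a minimal illustration, not the authors' implementation, of how these metrics are typically calculated with NumPy and SciPy; the function names, the surface extraction via binary erosion, and the default isotropic voxel spacing are assumptions.

```python
import numpy as np
from scipy import ndimage

def dice(a: np.ndarray, b: np.ndarray) -> float:
    """Dice similarity coefficient between two binary segmentation masks."""
    a, b = a.astype(bool), b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    return 2.0 * intersection / (a.sum() + b.sum())

def hd95(a: np.ndarray, b: np.ndarray, spacing=(1.0, 1.0, 1.0)) -> float:
    """95th-percentile symmetric Hausdorff distance between mask surfaces.

    `spacing` is the voxel size along each axis; with the assumed isotropic
    spacing of 1.0 the result is in voxels rather than millimetres.
    """
    a, b = a.astype(bool), b.astype(bool)
    # Surface voxels: the mask minus its binary erosion.
    surf_a = a ^ ndimage.binary_erosion(a)
    surf_b = b ^ ndimage.binary_erosion(b)
    # Distance from every voxel to the nearest surface voxel of the other mask.
    dist_to_b = ndimage.distance_transform_edt(~surf_b, sampling=spacing)
    dist_to_a = ndimage.distance_transform_edt(~surf_a, sampling=spacing)
    d_ab = dist_to_b[surf_a]  # A-surface -> B-surface distances
    d_ba = dist_to_a[surf_b]  # B-surface -> A-surface distances
    return float(np.percentile(np.concatenate([d_ab, d_ba]), 95))
```

In practice, the voxel spacing would be read from the CBCT header so that surface distances are reported in millimetres, and the reference mask would come from an expert segmentation.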
Download full-text PDF | Source
---|---
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11582138 | PMC
http://dx.doi.org/10.1007/s00784-024-06009-2 | DOI Listing
Microsc Res Tech
January 2025
Artificial Intelligence and Data Analytics (AIDA) lab, CCIS Prince Sultan University, Riyadh, Saudi Arabia.
Microscopic imaging aids disease diagnosis by quantifying cell morphology and tissue size. However, the high spatial resolution of these images poses significant challenges for manual quantitative evaluation. This project proposes computer-aided analysis methods to address these challenges, enabling rapid and precise clinical diagnosis, disease-course analysis, and prognostic prediction.
Abdom Radiol (NY)
December 2024
Department of Radiology, Mayo Clinic, Rochester, MN, USA.
Pancreatic ductal adenocarcinoma (PDAC) is the third leading cause of cancer-related deaths in the United States, largely due to its poor five-year survival rate and frequent late-stage diagnosis. A significant barrier to early detection even in high-risk cohorts is that the pancreas often appears morphologically normal during the pre-diagnostic phase. Yet, the disease can progress rapidly from subclinical stages to widespread metastasis, undermining the effectiveness of screening.
Lancet Digit Health
January 2025
Department of Radiation Oncology, Brigham and Women's Hospital, Dana-Farber Cancer Institute, Harvard Medical School, Boston, MA, USA; Artificial Intelligence in Medicine (AIM) Program, Mass General Brigham, Harvard Medical School, Boston, MA, USA. Electronic address:
Background: Palliative spine radiation therapy is prone to treatment at the wrong anatomic level. We developed a fully automated deep learning-based spine-targeting quality assurance system (DL-SpiQA) for detecting treatment at the wrong anatomic level. DL-SpiQA was evaluated based on retrospective testing of spine radiation therapy treatments and prospective clinical deployment.
Biosens Bioelectron
March 2025
Department of Biotechnology, National Formosa University, No. 64, Wunhua Rd, Huwei Township, Yunlin County, 63201, Taiwan. Electronic address:
The EZ DEVICE is an integrated fluorescence microflow cytometer designed for automated cell phenotyping and enumeration using artificial intelligence (AI). The platform consists of a laser diode, optical filter, objective lens, CMOS image sensor, and microfluidic chip, enabling automated sample pretreatment, labeling, and detection within a single compact unit. AI algorithms segment and identify objects in images captured by the CMOS sensor at 532 and 586 nm emission wavelengths.
Neurooncol Adv
December 2024
Department of Neurosurgery, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania, USA.
Background: Fully automatic skull-stripping and tumor segmentation are crucial for monitoring pediatric brain tumors (PBT). Current methods, however, often lack generalizability, particularly for rare tumors in the sellar/suprasellar regions and when applied to real-world clinical data in limited data scenarios. To address these challenges, we propose AI-driven techniques for skull-stripping and tumor segmentation.