Propelled by groundbreaking advances in the analysis of high-dimensional datasets and the increasing availability of imaging and clinical data, machine learning (ML) is poised to transform the practice of cardiovascular medicine. Owing to a growing body of literature validating both the diagnostic performance and the prognostic implications of anatomic and physiologic findings, coronary computed tomography angiography (CCTA) is now a well-established non-invasive modality for the assessment of cardiovascular disease. ML has been increasingly applied to optimize performance and to extract data from CCTA and non-contrast-enhanced cardiac CT scans. The purpose of this review is to describe the contemporary state of ML-based algorithms applied to cardiac CT and to provide clinicians with an understanding of their benefits and limitations.
DOI: http://dx.doi.org/10.1016/j.jcct.2018.04.010
Prenat Diagn
January 2025
Department of Artificial Intelligence, Faculty of Computer Science and Information Technology, Universiti Malaya, Kuala Lumpur, Malaysia.
Objective: The first objective is to develop a nuchal thickness reference chart. The second objective is to compare rule-based algorithms and machine learning models in predicting small-for-gestational-age infants.
Method: This retrospective study of singleton pregnancies at University Malaya Medical Centre, Malaysia, developed a nuchal thickness chart and evaluated its predictive value for small-for-gestational-age infants using Malaysian and Singaporean cohorts.
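The abstract contrasts rule-based algorithms with ML models but does not specify either. As a purely illustrative sketch, with synthetic data and a hypothetical nuchal-thickness cutoff (the real chart and cutoffs come from the study itself), the comparison framework might evaluate each predictor's sensitivity and specificity like this:

```python
import random

random.seed(0)

# Synthetic records (illustrative only): (nuchal_thickness_mm, birthweight_centile).
# Label 1 = small-for-gestational-age (SGA), defined here as birthweight below the 10th centile.
data = [(random.gauss(5.0, 1.2), random.uniform(0, 100)) for _ in range(500)]
labels = [1 if centile < 10 else 0 for _, centile in data]

def rule_based(nt_mm, cutoff=4.0):
    """Rule-based prediction: flag SGA when nuchal thickness falls below a fixed cutoff."""
    return 1 if nt_mm < cutoff else 0

def evaluate(preds, truth):
    """Return (sensitivity, specificity) for binary predictions."""
    tp = sum(p == 1 and t == 1 for p, t in zip(preds, truth))
    tn = sum(p == 0 and t == 0 for p, t in zip(preds, truth))
    fn = sum(p == 0 and t == 1 for p, t in zip(preds, truth))
    fp = sum(p == 1 and t == 0 for p, t in zip(preds, truth))
    sens = tp / (tp + fn) if tp + fn else 0.0
    spec = tn / (tn + fp) if tn + fp else 0.0
    return sens, spec

preds = [rule_based(nt) for nt, _ in data]
sens, spec = evaluate(preds, labels)
print(f"rule-based cutoff: sensitivity={sens:.2f}, specificity={spec:.2f}")
```

An ML model would be scored with the same `evaluate` function on held-out data, making the two approaches directly comparable.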
Diagn Interv Radiol
January 2025
Erzincan Binali Yıldırım University Faculty of Medicine, Department of Radiology, Erzincan, Türkiye.
Radiography is a field of medicine inherently intertwined with technology, with ultrasound (US), computed tomography (CT), and magnetic resonance imaging (MRI) all depending heavily on it to obtain images. Although radiation dose reduction is not applicable to US and MRI, advancements in technology have made it possible in CT, with ongoing studies aimed at further optimization.
Diagn Interv Radiol
January 2025
Huadong Hospital, Fudan University, Department of Thoracic Surgery, Shanghai, China.
Purpose: Patients with advanced non-small cell lung cancer (NSCLC) have varying responses to immunotherapy, but there are no reliable, accepted biomarkers to accurately predict its therapeutic efficacy. The present study aimed to construct individualized models through automatic machine learning (autoML) to predict the efficacy of immunotherapy in patients with inoperable advanced NSCLC.
Methods: A total of 63 eligible participants were included and randomized into training and validation groups.
Anal Methods
January 2025
Jiangsu Beier Machinery Co. Ltd, Jiangsu, 215600, China.
Plastic waste management is one of the key issues in global environmental protection. Integrating spectroscopy acquisition devices with deep learning algorithms has emerged as an effective method for rapid plastic classification. However, the challenges in collecting plastic samples and spectroscopy data have resulted in a limited number of data samples and an incomplete comparison of relevant classification algorithms.
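The study pairs spectroscopy with deep learning, but the abstract gives no architecture. As a minimal stand-in for the idea of classifying plastics from their spectra, the sketch below uses synthetic single-peak spectra and a nearest-centroid baseline (deliberately simpler than a deep network; the class names and peak positions are invented for illustration):

```python
import math
import random

random.seed(2)

def synth_spectrum(peak, n=50, noise=0.05):
    """Synthetic absorbance spectrum: one Gaussian peak plus noise (illustrative only)."""
    return [math.exp(-((i - peak) ** 2) / 20.0) + random.gauss(0, noise) for i in range(n)]

def centroid(spectra):
    """Mean spectrum of a class, channel by channel."""
    return [sum(col) / len(col) for col in zip(*spectra)]

def classify(spectrum, centroids):
    """Assign the class whose mean spectrum is closest in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(spectrum, centroids[label]))

# Two hypothetical plastic classes with absorption peaks at different channel indices.
train = {"PET": [synth_spectrum(15) for _ in range(20)],
         "HDPE": [synth_spectrum(35) for _ in range(20)]}
centroids = {label: centroid(spectra) for label, spectra in train.items()}
print(classify(synth_spectrum(15), centroids))
```

A deep-learning classifier would replace `classify` with a trained network, but the pipeline shape (acquire spectrum, featurize, predict class) is the same.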
Liver Int
February 2025
Liver Research Center, Beijing Friendship Hospital, Capital Medical University, Beijing, China.
Background And Aim: Discriminating between idiosyncratic drug-induced liver injury (DILI) and autoimmune hepatitis (AIH) is critical yet challenging. We aim to develop and validate a machine learning (ML)-based model to aid in this differentiation.
Methods: This multicenter cohort study utilised a development set from Beijing Friendship Hospital, with retrospective and prospective validation sets from 10 tertiary hospitals across various regions of China spanning January 2009 to May 2023.
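The abstract does not name the model family. As a purely illustrative sketch of a binary diagnostic classifier (synthetic standardized features; the feature names are hypothetical and not drawn from the study), a plain logistic regression trained by gradient descent might look like:

```python
import math
import random

random.seed(1)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logreg(X, y, lr=0.1, epochs=200):
    """Per-sample gradient-descent logistic regression (no regularization)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    """Label 1 when the predicted probability reaches 0.5."""
    return 1 if sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) >= 0.5 else 0

# Synthetic two-feature data; label 1 = AIH, 0 = DILI (entirely illustrative).
X = [[random.gauss(1, 0.5), random.gauss(1, 0.5)] for _ in range(100)]
X += [[random.gauss(-1, 0.5), random.gauss(-1, 0.5)] for _ in range(100)]
y = [1] * 100 + [0] * 100
w, b = train_logreg(X, y)
acc = sum(predict(w, b, xi) == yi for xi, yi in zip(X, y)) / len(y)
print(f"training accuracy on synthetic data: {acc:.2f}")
```

In the study's setting, external validation on the ten-hospital cohorts, not training accuracy, would be the relevant performance check.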