17 results match your criteria: "Md (P.H.Y.); Walter Reed National Military Medical Center (WRNMMC)[Affiliation]"
Radiology
November 2024
From the Department of Computer Science, Johns Hopkins University, 3400 N Charles St, Baltimore, MD 21218 (D.P., A.M., S.S., C.M.H.); Bayesian Health, New York, NY (S.S.); Department of Diagnostic Radiology, University of Maryland School of Medicine, Baltimore, Md (J.J., P.H.Y.); Department of Radiology, St Jude Children's Research Hospital, Memphis, Tenn (P.H.Y.); and Department of Radiology, Johns Hopkins University School of Medicine, Baltimore, Md (C.T.L.).
Background It is unclear whether artificial intelligence (AI) explanations help or hurt radiologists and other physicians in AI-assisted radiologic diagnostic decision-making. Purpose To test whether the type of AI explanation and the correctness and confidence level of AI advice impact physician diagnostic performance, perception of AI advice usefulness, and trust in AI advice for chest radiograph diagnosis. Materials and Methods A multicenter, prospective randomized study was conducted from April 2022 to September 2022.
Radiology
October 2024
From Drexel University College of Medicine, Philadelphia, Pa (S.M.S.); Department of Radiology, Columbia University Irving Medical Center, New York, NY (J.R.Z.); Department of Radiology, Wake Forest University Health Sciences Center, Winston-Salem, NC (K.H.); Department of Diagnostic Radiology and Nuclear Medicine, University of Maryland School of Medicine, Baltimore, Md (J.J., V.P.); and Department of Diagnostic Imaging, St. Jude Children's Research Hospital, 262 Danny Thomas Plc, Memphis, TN 38105-3678 (P.H.Y.).
Background Natural language processing (NLP) is commonly used to annotate radiology datasets for training deep learning (DL) models. However, the accuracy and potential biases of these NLP methods have not been thoroughly investigated, particularly across different demographic groups. Purpose To evaluate the accuracy and demographic bias of four NLP radiology report labeling tools on two chest radiograph datasets.
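A minimal sketch of the kind of stratified evaluation this abstract describes: comparing a labeler's output against reference labels and reporting sensitivity by demographic group. The data frame, column names, and values below are invented for illustration and are not from the study.

```python
# Hypothetical example: per-group sensitivity of an NLP report labeler.
import pandas as pd
from sklearn.metrics import recall_score

df = pd.DataFrame({
    "label_ref":  [1, 0, 1, 1, 0, 1, 0, 0],   # reference (radiologist) labels
    "label_pred": [1, 0, 0, 1, 0, 1, 1, 0],   # labels assigned by the NLP tool
    "sex":        ["F", "F", "F", "M", "M", "M", "M", "F"],
})

# Overall sensitivity (recall on the positive class)
overall = recall_score(df["label_ref"], df["label_pred"])

# Sensitivity stratified by demographic group, to surface potential bias
by_group = {
    group: recall_score(g["label_ref"], g["label_pred"])
    for group, g in df.groupby("sex")
}

print(f"Overall sensitivity: {overall:.2f}")
print(by_group)
```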
Radiology
August 2024
From the University of Maryland Medical Intelligent Imaging (UM2ii) Center, Department of Radiology and Nuclear Medicine, University of Maryland School of Medicine, 22 S Greene St, Baltimore, MD 21201 (F.X.D., D.S., A.K., P.H.Y., V.S.P.); Department of Radiology, University of Michigan, Ann Arbor, Mich (R.C.C.); and Department of Computer Science and Electrical Engineering, University of Maryland Baltimore County, Baltimore, Md (A.J.).
Radiol Artif Intell
May 2024
From the Drexel University College of Medicine, Philadelphia, Pa (S.M.S.); University of Maryland Medical Intelligent Imaging (UM2ii) Center, Department of Diagnostic Radiology and Nuclear Medicine, University of Maryland School of Medicine, 670 W Baltimore St, 1st Fl, Room 1172, Baltimore, MD 21201 (S.M.S., K.P., E.B., V.S.P., P.H.Y.); and Malone Center for Engineering in Healthcare, Johns Hopkins University, Baltimore, Md (P.H.Y.).
Purpose To evaluate the robustness of an award-winning bone age deep learning (DL) model to extensive variations in image appearance. Materials and Methods In December 2021, the DL bone age model that won the 2017 RSNA Pediatric Bone Age Challenge was retrospectively evaluated using the RSNA validation set (1425 pediatric hand radiographs; internal test set in this study) and the Digital Hand Atlas (DHA) (1202 pediatric hand radiographs; external test set). Each test image underwent seven types of transformations (rotations, flips, brightness, contrast, inversion, laterality marker, and resolution) to represent a range of image appearances, many of which simulate real-world variations.
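As a rough illustration of the transformation types listed above (not the authors' code), the sketch below perturbs a hand radiograph with rotation, a laterality flip, brightness and contrast changes, photometric inversion, and resolution reduction using Pillow; the file path is hypothetical.

```python
# Minimal sketch, assuming a grayscale hand radiograph at "hand_xray.png" (hypothetical path).
from PIL import Image, ImageEnhance, ImageOps

img = Image.open("hand_xray.png").convert("L")   # grayscale radiograph

perturbed = {
    "rotated_10deg":  img.rotate(10, expand=True),
    "flipped_lr":     ImageOps.mirror(img),                       # laterality flip
    "brighter":       ImageEnhance.Brightness(img).enhance(1.5),
    "low_contrast":   ImageEnhance.Contrast(img).enhance(0.5),
    "inverted":       ImageOps.invert(img),                       # photometric inversion
    "low_resolution": img.resize((img.width // 4, img.height // 4)).resize(img.size),
}

# Each variant would then be passed through the bone age model and the predicted
# ages compared with the prediction on the unmodified image.
for name, variant in perturbed.items():
    variant.save(f"hand_xray_{name}.png")
```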
Radiol Imaging Cancer
March 2024
From the University of Maryland Medical Intelligent Imaging (UM2ii) Center, Department of Diagnostic Radiology and Nuclear Medicine, University of Maryland School of Medicine, 670 W Baltimore St, First Floor, Rm 1172, Baltimore, MD 21201 (H.L.H., A.K.G., J.J., P.H.Y.); The Russell H. Morgan Department of Radiology and Radiological Science, Johns Hopkins University School of Medicine, Baltimore, Md (E.B.A., E.T.O.); Department of Radiology, Division of Breast Imaging, Massachusetts General Hospital, Boston, Mass (M.B.); Malone Center for Engineering in Healthcare, Whiting School of Engineering, Johns Hopkins University, Baltimore, Md (P.H.Y.); and Fischell Department of Bioengineering, A. James Clark School of Engineering, University of Maryland-College Park, College Park, Md (P.H.Y.).
Purpose To evaluate the use of ChatGPT as a tool to simplify answers to common questions about breast cancer prevention and screening. Materials and Methods In this retrospective, exploratory study, ChatGPT was requested to simplify responses to 25 questions about breast cancer to a sixth-grade reading level in March and August 2023. Simplified responses were evaluated for clinical appropriateness.
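For readers who want to check whether a simplified answer actually reaches roughly a sixth-grade reading level, a minimal sketch using the Flesch-Kincaid grade from the textstat package is shown below; the example sentences are invented, and this is not the study's scoring method.

```python
# Hypothetical readability check with textstat's Flesch-Kincaid grade.
import textstat

original = (
    "Screening mammography utilizes low-dose radiographic imaging to detect "
    "malignancies prior to the onset of clinically palpable abnormalities."
)
simplified = (
    "A mammogram is an x-ray of the breast. It can find cancer early, "
    "before you or your doctor can feel a lump."
)

for label, text in [("original", original), ("simplified", simplified)]:
    grade = textstat.flesch_kincaid_grade(text)
    print(f"{label}: Flesch-Kincaid grade {grade:.1f}")
```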
Radiol Artif Intell
January 2024
From the Department of Computer Science (J.T.), Department of Biomedical Engineering (J.S.), and Mathematical Institute for Data Science (MINDS) (J.S., J.T.), Johns Hopkins University, 3400 N Charles St, Clark Hall, Suite 320, Baltimore, MD 21218; and University of Maryland Medical Intelligent Imaging Center (UM2ii), Department of Diagnostic Radiology and Nuclear Medicine, University of Maryland School of Medicine, Baltimore, Md (P.H.Y.).
Purpose To compare the effectiveness of weak supervision (ie, with examination-level labels only) and strong supervision (ie, with image-level labels) in training deep learning models for detection of intracranial hemorrhage (ICH) on head CT scans. Materials and Methods In this retrospective study, an attention-based convolutional neural network was trained with either local (ie, image level) or global (ie, examination level) binary labels on the Radiological Society of North America (RSNA) 2019 Brain CT Hemorrhage Challenge dataset of 21 736 examinations (8876 [40.8%] ICH) and 752 422 images (107 784 [14.
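The weak- versus strong-supervision distinction can be made concrete with a small PyTorch sketch: strong supervision applies a loss to every slice-level (image-level) label, while weak supervision aggregates slice predictions into a single examination-level prediction before the loss. The stand-in classifier and random tensors below are purely illustrative, not the authors' attention-based network.

```python
# Illustrative only: local (slice-level) vs global (examination-level) supervision.
import torch
import torch.nn.functional as F

model = torch.nn.Sequential(          # stand-in per-slice classifier
    torch.nn.Flatten(),
    torch.nn.Linear(64 * 64, 1),
)

slices = torch.randn(24, 1, 64, 64)              # one examination = a stack of CT slices
slice_logits = model(slices).squeeze(1)          # one hemorrhage logit per slice

# Strong supervision: a binary label for every slice (image level).
slice_labels = torch.randint(0, 2, (24,)).float()
strong_loss = F.binary_cross_entropy_with_logits(slice_logits, slice_labels)

# Weak supervision: only one label for the whole examination; aggregate the
# slice predictions (here with a max) before computing the loss.
exam_label = torch.tensor(1.0)
exam_logit = slice_logits.max()
weak_loss = F.binary_cross_entropy_with_logits(exam_logit, exam_label)

print(strong_loss.item(), weak_loss.item())
```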
Radiology
November 2023
From the University of Maryland Medical Intelligent Imaging (UM2ii) Center, University of Maryland School of Medicine, 670 W Baltimore St, First Floor, Room 1172, Baltimore, MD 21201 (P.B., S.P.G., P.K., A.K., V.S.P., P.H.Y.); Johns Hopkins University School of Medicine, Baltimore, Md (P.B.); Uniformed Services University of the Health Sciences, Bethesda, Md (S.P.G.); and Department of Biomedical Engineering, Johns Hopkins University, Baltimore, Md (J.S.).
See also the editorial by Nikolic in this issue.
Radiol Cardiothorac Imaging
August 2023
University of Maryland Medical Intelligent Imaging (UM2ii) Center, Department of Diagnostic Radiology and Nuclear Medicine, University of Maryland School of Medicine, 22 S Greene St, Baltimore, MD 21201.
Radiol Artif Intell
March 2023
University of Maryland Medical Intelligent Imaging (UM2ii) Center, Department of Diagnostic Radiology and Nuclear Medicine, University of Maryland School of Medicine, 670 W Baltimore St, First Floor, Room 1172, Baltimore, MD 21201 (S.M.S., P.H.Y.); The Russell H. Morgan Department of Radiology and Radiological Science, Johns Hopkins University School of Medicine, Baltimore, Md (N.H.N., V.S.P.); Department of Computer Science, Whiting School of Engineering (V.S.P.), and Malone Center for Engineering in Healthcare (P.H.Y.), Johns Hopkins University, Baltimore, Md.
Purpose: To evaluate the performance and usability of code-free deep learning (CFDL) platforms in creating DL models for disease classification, object detection, and segmentation on chest radiographs.
Materials And Methods: Six CFDL platforms were evaluated in this retrospective study (September 2021). Single- and multilabel classifiers were trained for thoracic pathologic conditions using Guangzhou pediatric and NIH-CXR14 (ie, National Institutes of Health ChestX-ray14) datasets, and external testing was performed using subsets of NIH-CXR14 and Stanford CheXpert datasets, respectively.
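Although the platforms themselves are code-free, external testing of their exported predictions can still be scripted. The sketch below computes per-label AUC for a multilabel classifier on an external test set; the label names and arrays are invented, not the study's data.

```python
# Hypothetical external-test scoring of exported multilabel predictions.
import numpy as np
from sklearn.metrics import roc_auc_score

labels = ["atelectasis", "effusion", "pneumonia"]
y_true = np.random.randint(0, 2, size=(200, 3))                        # external reference labels
y_prob = np.clip(y_true * 0.6 + np.random.rand(200, 3) * 0.4, 0, 1)    # platform output scores

for i, name in enumerate(labels):
    auc = roc_auc_score(y_true[:, i], y_prob[:, i])
    print(f"{name}: AUC = {auc:.2f}")
```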
Radiology
May 2023
From the University of Maryland Medical Intelligent Imaging (UM2ii) Center, Department of Diagnostic Radiology and Nuclear Medicine, University of Maryland School of Medicine, 670 W Baltimore St, First Floor, Room 1172, Baltimore, MD 21201 (H.L.H., J.J., P.H.Y.); The Russell H. Morgan Department of Radiology and Radiological Science, Johns Hopkins University School of Medicine, Baltimore, Md (E.B.A., E.T.O.); Department of Radiology, Division of Breast Imaging, Massachusetts General Hospital, Boston, Mass (M.B.); Malone Center for Engineering in Healthcare, Whiting School of Engineering, Johns Hopkins University, Baltimore, Md (P.H.Y.); and Fischell Department of Bioengineering, A. James Clark School of Engineering, University of Maryland, College Park, Md (P.H.Y.).
Radiol Artif Intell
November 2022
Department of Anesthesiology, University of Michigan, Ann Arbor, Mich (Z.R.M.); Department of Biomedical Engineering, Johns Hopkins University Whiting School of Engineering, Baltimore, Md (K.V., J.S.); and University of Maryland Medical Intelligent Imaging (UM2ii) Center, Department of Diagnostic Radiology and Nuclear Medicine, University of Maryland School of Medicine, 22 S Greene St, First Floor, Baltimore, MD 21201 (P.H.Y.).
Purpose: To compare performance, sample efficiency, and hidden stratification of vision transformer (ViT) and convolutional neural network (CNN) architectures for diagnosis of disease on chest radiographs and extremity radiographs using transfer learning.
Materials And Methods: In this HIPAA-compliant retrospective study, the authors fine-tuned data-efficient image transformers (DeiT) ViT and CNN classification models pretrained on ImageNet using the National Institutes of Health Chest X-ray 14 dataset (112 120 images) and MURA dataset (14 656 images) for thoracic disease and extremity abnormalities, respectively. Performance was assessed on internal test sets and 75 000 external chest radiographs (three datasets).
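A minimal sketch of this transfer-learning setup using the timm library is shown below. The specific model identifiers (a DeiT base model and DenseNet-121 as the CNN) and the 14-class head are assumptions for illustration, not necessarily the authors' exact configuration, and the training loop is omitted.

```python
# Assumed setup: ImageNet-pretrained DeiT and CNN backbones with new 14-class heads.
import timm
import torch

num_classes = 14  # NIH ChestX-ray14 disease labels

vit = timm.create_model("deit_base_patch16_224", pretrained=True, num_classes=num_classes)
cnn = timm.create_model("densenet121", pretrained=True, num_classes=num_classes)

x = torch.randn(2, 3, 224, 224)       # batch of preprocessed chest radiographs
print(vit(x).shape, cnn(x).shape)     # both: torch.Size([2, 14])
```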
Radiol Artif Intell
September 2022
Department of Biomedical Engineering, Johns Hopkins University, Baltimore, Md (K.V., J.S.); and University of Maryland Medical Intelligent Imaging (UM2ii) Center, Department of Radiology and Nuclear Medicine, University of Maryland School of Medicine, 670 W Baltimore St, First Floor, Room 1172, Baltimore, MD 21201 (K.V., S.M.S., P.H.Y.).
Purpose: To evaluate code and data sharing practices in original artificial intelligence (AI) scientific manuscripts published in the Radiological Society of North America (RSNA) journals suite from 2017 through 2021.
Materials And Methods: A retrospective meta-research study was conducted of articles published in the RSNA journals suite from January 1, 2017, through December 31, 2021. A total of 218 articles were included and evaluated for code sharing practices, reproducibility of shared code, and data sharing practices.
Radiol Artif Intell
September 2021
University of Maryland Medical Intelligent Imaging Center, Department of Radiology and Nuclear Medicine, University of Maryland School of Medicine, Baltimore, Md (P.H.Y.); Malone Center for Engineering in Healthcare, Whiting School of Engineering, Johns Hopkins University, 601 N Caroline St, Baltimore, MD 21287 (P.H.Y.); and Department of Radiology, New York University Grossman School of Medicine, New York, NY (J.F.).
Radiol Cardiothorac Imaging
June 2021
Russell H. Morgan Department of Radiology, The Johns Hopkins University School of Medicine, 601 N Caroline St, Baltimore, MD 21205-2105.
Purpose: To assess the ability of deep convolutional neural networks (DCNNs) to predict coronary artery calcium (CAC) and cardiovascular risk on chest radiographs.
Materials And Methods: In this retrospective study, 1689 radiographs in patients who underwent cardiac CT and chest radiography within the same year, between 2013 and 2018, were included (mean age, 56 years ± 11 [standard deviation]; 969 radiographs in women). Agatston scores were used as ground truth labels for DCNN training on radiographs.
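One plausible way to turn Agatston scores into training labels, as the abstract implies, is to bin them into the commonly reported CAC categories (0, 1-99, 100-399, >=400). The sketch below does this with invented scores and is not the authors' pipeline.

```python
# Hypothetical binning of Agatston scores into CAC risk categories.
import numpy as np

agatston = np.array([0, 12, 250, 870, 45, 0, 1300])   # invented scores

thresholds = [1, 100, 400]                             # category boundaries
categories = ["none", "mild", "moderate", "severe"]
labels = np.digitize(agatston, thresholds)             # 0..3 index into categories

for score, idx in zip(agatston, labels):
    print(f"Agatston {score:>5}: {categories[idx]}")
```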
Radiographics
October 2019
From the Russell H. Morgan Department of Radiology and Radiological Science (N.G., P.H.Y., K.C.), Sections of Body CT (E.K.F.) and Musculoskeletal Radiology (J.F.), Johns Hopkins Hospital, 601 N Caroline St, Room 3014, Baltimore, MD 21287; and Spine Division, Department of Orthopedics, Balgrist University Hospital Zurich, Zurich, Switzerland (M.F.).
During the past 2 decades, the number of spinal surgeries performed annually has been steadily increasing, and these procedures are being accompanied by a growing number of postoperative imaging studies to interpret. CT is accurate for identifying the location and integrity of implants, assessing the success of decompression and intervertebral arthrodesis procedures, and detecting and characterizing related complications. Although postoperative spinal CT is often limited owing to artifacts caused by metallic implants, parameter optimization and advanced metal artifact reduction techniques, including iterative reconstruction and monoenergetic extrapolation methods, can be used to reduce metal artifact severity and improve image quality substantially.
Radiology
April 2016
From the National Capital Neuroimaging Consortium (NCNC), Bethesda, Md (G.R., J.S.S., W.L., J.O., E.S., P.H.Y., J.G., D.N., J.C., J.H., V.E., J.M., T.R.O.); National Intrepid Center of Excellence (NICoE), 4860 S Palmer Rd, Bethesda, MD 20889 (G.R., J.S.S., W.L., J.O., E.S., P.H.Y., J.G., D.N., J.C., L.M.F., V.E., J.M., T.R.O.); Center for Neuroscience and Regenerative Medicine, Bethesda, Md (G.R., L.M.F.); Uniformed Services University of the Health Sciences, Bethesda, Md (G.R., A.S., L.M.F.); Henry M. Jackson Foundation for the Advancement of Military Medicine, Bethesda, Md (P.H.Y.); Walter Reed National Military Medical Center (WRNMMC), Bethesda, Md (P.K., L.M.F.); and VA Maryland Health Care System (VAMHCS), Baltimore, Md (J.B.P.).
Purpose: To describe the initial neuroradiology findings in a cohort of military service members with primarily chronic mild traumatic brain injury (TBI) from blast by using an integrated magnetic resonance (MR) imaging protocol.
Materials And Methods: This study was approved by the Walter Reed National Military Medical Center institutional review board and is compliant with HIPAA guidelines. All participants were military service members or dependents recruited between August 2009 and August 2014.
Radiology
February 2016
From the National Intrepid Center of Excellence (NICoE), Walter Reed National Military Medical Center, 4860 S Palmer Rd, Bethesda, MD 20889-5649 (W.L., K.S., J.S.S., D.J., P.H.Y., J.O., E.B.S., T.R.O., G.R.); Center for Neuroscience and Regenerative Medicine, Bethesda, Md (D.J., T.R.O., G.R.); The Henry M. Jackson Foundation for the Advancement of Military Medicine, Bethesda, Md (D.J.); Biomedical Engineering Department, Cornell University, New York, NY (T.L., Y.W.); and The NorthTide Group, Sterling, Va (W.L., E.B.S.).
Purpose: To detect cerebral microhemorrhages in military service members with chronic traumatic brain injury by using susceptibility-weighted magnetic resonance (MR) imaging. The longitudinal evolution of microhemorrhages was monitored in a subset of patients by using quantitative susceptibility mapping.
Materials And Methods: The study was approved by the Walter Reed National Military Medical Center institutional review board and is compliant with HIPAA guidelines.