DOI: http://dx.doi.org/10.1162/artl_e_00409
Eur Heart J Digit Health
January 2025
Department of Medicine, Université de Montréal, 5000 Bélanger Street, Montreal, Québec H3T 1J4, Canada.
Front Artif Intell
January 2025
Center for Mind/Brain Sciences, University of Trento, Trento, Italy.
The impressive performance of modern Large Language Models (LLMs) across a wide range of tasks, along with their often non-trivial errors, has garnered unprecedented attention regarding the potential of AI and its impact on everyday life. While considerable effort has been and continues to be dedicated to overcoming the limitations of current models, the potential and risks of human-LLM collaboration remain largely underexplored. In this perspective, we argue that a stronger focus on human-LLM interaction should be a primary target for future LLM research.
Diabetes Care
February 2025
Center for the Study of Aging and Human Development, Duke University, Durham, NC.
J Educ Eval Health Prof
January 2025
President, Korea Health Personnel Licensing Examination Institute, Seoul, Korea.
Eur Radiol
January 2025
Department of Radiology, Seoul National University College of Medicine, Seoul National University Hospital, Seoul, Republic of Korea.
Objective: This study aimed to develop an open-source multimodal large language model (CXR-LLaVA) for interpreting chest X-ray images (CXRs), leveraging recent advances in large language models (LLMs) to potentially replicate the image interpretation skills of human radiologists.
Materials and Methods: For training, we collected 592,580 publicly available CXRs, of which 374,881 had labels for certain radiographic abnormalities (Dataset 1) and 217,699 provided free-text radiology reports (Dataset 2). After pre-training a vision transformer with Dataset 1, we integrated it with an LLM influenced by the LLaVA network.
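To make the described pipeline concrete, below is a minimal PyTorch sketch of the LLaVA-style coupling the Methods outline: a vision transformer encodes the chest X-ray into patch tokens, a projection layer maps those tokens into the language model's embedding space, and the projected visual tokens are prepended to the text tokens before decoding. All class names, layer sizes, and the toy transformer stacks here are illustrative assumptions, not the actual CXR-LLaVA implementation.

```python
# Hypothetical LLaVA-style coupling of a vision transformer and a language model.
# Dimensions, module names, and depths are assumptions for illustration only.
import torch
import torch.nn as nn

class VisionEncoder(nn.Module):
    """Stand-in for the ViT pre-trained on Dataset 1 (labeled CXRs)."""
    def __init__(self, vision_dim=768):
        super().__init__()
        # Single-channel input, since CXRs are grayscale; 16x16 patches.
        self.patch_embed = nn.Conv2d(1, vision_dim, kernel_size=16, stride=16)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=vision_dim, nhead=8, batch_first=True),
            num_layers=2,  # shallow toy depth, not the real model's
        )

    def forward(self, pixels):               # pixels: (B, 1, 224, 224)
        x = self.patch_embed(pixels)         # (B, D, 14, 14)
        x = x.flatten(2).transpose(1, 2)     # (B, 196, D) patch tokens
        return self.encoder(x)

class CxrLlavaSketch(nn.Module):
    """Joins the vision encoder to a toy causal LM via a linear projector."""
    def __init__(self, vision_dim=768, lm_dim=1024, vocab_size=32000):
        super().__init__()
        self.vision = VisionEncoder(vision_dim=vision_dim)
        # The projector maps patch embeddings into the LM token-embedding space.
        self.projector = nn.Linear(vision_dim, lm_dim)
        self.tok_embed = nn.Embedding(vocab_size, lm_dim)
        self.lm = nn.TransformerEncoder(     # stand-in for the LLM decoder stack
            nn.TransformerEncoderLayer(d_model=lm_dim, nhead=8, batch_first=True),
            num_layers=2,
        )
        self.lm_head = nn.Linear(lm_dim, vocab_size)

    def forward(self, pixels, input_ids):
        visual_tokens = self.projector(self.vision(pixels))   # (B, 196, lm_dim)
        text_tokens = self.tok_embed(input_ids)               # (B, T, lm_dim)
        seq = torch.cat([visual_tokens, text_tokens], dim=1)  # image tokens first
        return self.lm_head(self.lm(seq))                     # next-token logits

model = CxrLlavaSketch()
logits = model(torch.randn(2, 1, 224, 224), torch.randint(0, 32000, (2, 16)))
print(logits.shape)  # torch.Size([2, 212, 32000]): 196 visual + 16 text positions
```

The key design point this sketch reflects is that the vision backbone and the LLM are trained or chosen separately and then bridged by a lightweight projection, so the LLM can attend to image patches as if they were ordinary tokens when generating a report.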