| Download full-text PDF | Source |
|---|---|
| http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10441939 | PMC |
| http://dx.doi.org/10.1007/s12055-023-01568-7 | DOI Listing |
Front Med (Lausanne)
January 2025
Clinical Informatics Fellowship Program, Baylor Scott & White Health, Round Rock, TX, United States.
Generative artificial intelligence (GenAI) is rapidly transforming various sectors, including healthcare and education. This paper explores the potential opportunities and risks of GenAI in graduate medical education (GME). We review the existing literature and provide commentary on how GenAI could impact GME, including five key areas of opportunity: electronic health record (EHR) workload reduction, clinical simulation, individualized education, research and analytics support, and clinical decision support.
Digit Health
January 2025
Independent Researcher, Calgary, Alberta, Canada.
Digital health (DH) and artificial intelligence (AI) in healthcare are rapidly evolving, yet many healthcare authorities and practitioners treat the two terms as synonymous. A clear understanding and delineation of these concepts is a prerequisite for developing robust frameworks and practical guidelines that ensure the safety, efficacy, and effectiveness of DH solutions and AI-embedded technologies. Categorizing DH into technologies (DHTs) and services (DHSs) enables regulatory, health technology assessment (HTA), and reimbursement bodies to develop category-specific frameworks and guidelines for evaluating these solutions effectively.
JMIR Ment Health
January 2025
The Samueli Initiative for Responsible AI in Medicine, Tel Aviv University, Tel Aviv, Israel.
Generative artificial intelligence (GenAI) shows potential for personalized care, psychoeducation, and even crisis prediction in mental health, yet responsible use requires ethical deliberation and perhaps even governance. This is the first published theme issue focused on responsible GenAI in mental health. It brings together evidence and insights on GenAI's capabilities, such as emotion recognition, therapy-session summarization, and risk assessment, while highlighting the sensitive nature of mental health data and the need for rigorous validation.
Patient Educ Couns
January 2025
Wiser Healthcare, Sydney School of Public Health, Faculty of Medicine and Health, The University of Sydney, Australia; The Daffodil Centre, The University of Sydney, a joint venture with Cancer Council NSW, NSW, Australia.
Objective: This study aimed to assess whether information from AI chatbots on the benefits and harms of breast and prostate cancer screening was concordant with evidence-based cancer screening recommendations.
Methods: Seven unique prompts (four breast cancer; three prostate cancer) were presented to ChatGPT in March 2024. A total of 60 criteria (30 breast; 30 prostate) were used to assess the concordance of information.
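This prompt-and-checklist design lends itself to a scripted replication. Below is a minimal, hypothetical sketch assuming the OpenAI Python client; the study itself presented prompts through the ChatGPT interface in March 2024 and had reviewers apply the 60 criteria, so the model name, prompt wording, criterion keywords, and the naive keyword check here are illustrative placeholders rather than the study's materials.

```python
# Hypothetical sketch: replay screening-information prompts against a chat model
# and tally concordance against a checklist of evidence-based criteria.
# Prompts, criteria, and model name are placeholders, not the study's instruments.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompts = {
    "breast_benefits": "What are the benefits of breast cancer screening?",
    "prostate_harms": "What are the harms of prostate cancer screening?",
}

# In the study, trained reviewers judged each criterion; a crude keyword match
# is used here only to keep the sketch self-contained and runnable.
criteria = {
    "breast_benefits": ["mortality"],
    "prostate_harms": ["overdiagnosis", "false positive"],
}

for key, prompt in prompts.items():
    reply = client.chat.completions.create(
        model="gpt-4o",  # placeholder; the study queried ChatGPT as deployed in March 2024
        messages=[{"role": "user", "content": prompt}],
    ).choices[0].message.content
    hits = sum(term in reply.lower() for term in criteria[key])
    print(f"{key}: {hits}/{len(criteria[key])} criteria matched")
```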
Life (Basel)
January 2025
Department of Hand and Plastic Surgery, Thurgau Hospital Group, 8501 Frauenfeld, Switzerland.
AI, especially ChatGPT, is impacting healthcare through applications in research, patient communication, and training. To our knowledge, this is the first study to examine ChatGPT-4's ability to analyze images of lower leg defects and to assess its understanding of complex case reports, comparing its performance with that of board-certified surgeons and residents. We conducted a cross-sectional survey in Switzerland, Germany, and Austria, in which 52 participants reviewed images depicting lower leg defects within fictitious patient profiles and selected the optimal reconstruction techniques.