Techniques of artificial intelligence (AI) are increasingly used in patient care, for example to provide a diagnosis in radiological imaging, to improve workflow by triaging patients, or to offer an expert opinion based on clinical symptoms; however, such AI techniques also carry intrinsic risks, as AI algorithms may point in the wrong direction and constitute a black box that does not explain the reasoning behind its decisions. This article outlines a case in which an erroneous ChatGPT diagnosis, relied upon by the patient to evaluate symptoms, led to a significant treatment delay and a potentially life-threatening situation. With this case, we would like to highlight the typical risks posed by the widespread use of AI tools not intended for medical decision-making.
Background: Practice-oriented phases, such as the mandatory clinical traineeships and the final clinical internship, are of great importance in the medical curriculum and in the skills training of medical students.
Aim: With respect to the practical training phases, namely the clinical clerkship and the final-year medical internship, we present the concept of two innovative courses designed to prepare students for and to evaluate these crucial sections of training, together with initial experiences from teaching practice.
Method: A narrative review is given.
Against the background of the current pandemic crisis, this case report presents the experiences gained from interprofessional teamwork among team members with different levels of medical qualification. Our objectives were to identify areas of shared knowledge regarding efficient collaboration, to improve effective teamwork based on mutual respect, and to develop innovative teaching methods tailored to the needs of COVID-19 interprofessional response teams. Field notes from numerous team discussions and improvised internal training sessions were compiled into a checklist.