Interpreting machine learning (ML) models is of paramount importance when making decisions with societal impact, such as transport control, financial activities, and medical diagnosis. While local explanation techniques are popular methods for interpreting an ML model's prediction on a single instance, they do not scale to explaining a model's behavior across an entire dataset. In this article, we outline the challenges and needs of visually analyzing local explanations and propose SUBPLEX, a visual analytics approach that helps users understand local explanations through subpopulation visual analysis.
The COVID-19 pandemic has had profound implications for continuing medical education. Travel restrictions, lockdowns, and social distancing measures to curb spread have meant that medical conferences have been postponed or cancelled. When the Australian and New Zealand College of Anaesthetists made the decision to commit to a fully virtual 2021 Annual Scientific Meeting, the organising committee investigated the viability of presenting a virtual 'Can't intubate, can't oxygenate' workshop.