IEEE J Biomed Health Inform
November 2024
Uncertainty quantification is critical for ensuring the safety of deep learning-enabled health diagnostics, as it helps the model account for unknown factors and reduces the risk of misdiagnosis. However, existing uncertainty quantification studies often overlook the significant issue of class imbalance, which is common in medical data. In this paper, we propose a class-balanced evidential deep learning framework to achieve fair and reliable uncertainty estimates for health diagnostic models.
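The abstract does not describe the framework's internals; as a purely illustrative sketch, evidential deep learning typically converts per-class evidence into Dirichlet parameters, from which a belief mass and a total ("vacuity") uncertainty follow, and a class-balanced variant could re-weight the loss by an effective-number scheme. The function names, the weighting choice, and the loss form below are assumptions, not the authors' method.

    import numpy as np
    from scipy.special import digamma

    def dirichlet_uncertainty(evidence):
        """Evidential outputs -> Dirichlet parameters, per-class belief, total uncertainty."""
        alpha = evidence + 1.0                      # Dirichlet concentration parameters
        S = alpha.sum(axis=-1, keepdims=True)       # total evidence + K
        belief = evidence / S                       # per-class belief mass
        u = evidence.shape[-1] / S                  # vacuity: high when evidence is scarce
        return alpha, belief, u

    def class_balanced_edl_loss(evidence, y_onehot, class_counts, beta=0.999):
        """Standard EDL Bayes-risk loss with an assumed effective-number class weighting
        (illustrative only)."""
        alpha, _, _ = dirichlet_uncertainty(evidence)
        S = alpha.sum(axis=-1, keepdims=True)
        # per-sample expected cross-entropy under the Dirichlet
        per_sample = (y_onehot * (digamma(S) - digamma(alpha))).sum(axis=-1)
        # re-weight each sample by the inverse effective number of its class
        weights = (1.0 - beta) / (1.0 - beta ** np.asarray(class_counts, dtype=float))
        weights = weights / weights.sum() * len(class_counts)
        sample_w = weights[y_onehot.argmax(axis=-1)]
        return (sample_w * per_sample).mean()

The vacuity term u approaches 1 when little evidence supports any class, which is the kind of quantity a diagnostic pipeline would threshold before trusting a prediction.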
Annu Int Conf IEEE Eng Med Biol Soc
July 2023
Supervised machine learning (ML) is revolutionising healthcare, but the acquisition of reliable labels for signals harvested from medical sensors is usually challenging, manual, and costly. Active learning can assist in establishing labels on-the-fly by querying the user only for the most uncertain, and thus most informative, samples. However, current approaches rely on naive data selection algorithms, which still require many iterations to achieve the desired accuracy.
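For context, the naive selection strategy this entry criticises is usually plain uncertainty sampling: rank the unlabelled pool by predictive entropy and query the top few. A minimal sketch of that baseline, with hypothetical inputs, follows.

    import numpy as np

    def entropy_query(probabilities, n_queries=10):
        """Naive uncertainty sampling: rank unlabelled samples by predictive entropy
        and return the indices of the most uncertain ones to send to the annotator."""
        p = np.clip(probabilities, 1e-12, 1.0)
        entropy = -(p * np.log(p)).sum(axis=1)
        return np.argsort(entropy)[::-1][:n_queries]

    # Hypothetical softmax outputs on an unlabelled pool of sensor windows
    pool_probs = np.array([[0.98, 0.02],   # confident -> unlikely to be queried
                           [0.55, 0.45],   # near the decision boundary -> queried first
                           [0.70, 0.30]])
    print(entropy_query(pool_probs, n_queries=1))   # -> [1]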
Annu Int Conf IEEE Eng Med Biol Soc
July 2022
Deep learning techniques are increasingly used for decision-making in health applications; however, they can easily be manipulated by adversarial examples across different clinical domains. Their security and privacy vulnerabilities raise concerns about the practical deployment of these systems. The number and variety of adversarial attacks grow continuously, making it difficult for mitigation approaches to provide effective solutions.
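As an illustration of what "manipulated by adversarial examples" means in practice (not the specific attacks studied in this entry), the classic fast gradient sign method perturbs an input within a small L-infinity budget in the direction that increases the loss; the sketch below applies it to a simple logistic-regression classifier with assumed weights.

    import numpy as np

    def fgsm_perturb(x, w, b, y, eps=0.05):
        """Fast Gradient Sign Method on a logistic-regression classifier: shift the
        input by eps in the sign of the loss gradient w.r.t. the input."""
        z = x @ w + b
        p = 1.0 / (1.0 + np.exp(-z))      # predicted probability of class 1
        grad_x = (p - y) * w              # d(binary cross-entropy)/dx
        return x + eps * np.sign(grad_x)  # imperceptible shift that raises the loss

Even a correctly classified input close to the decision boundary can be flipped by such a perturbation, which is why clinical deployments need explicit mitigation strategies.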