Multimodal Risk Prediction with Physiological Signals, Medical Images and Clinical Notes.

medRxiv

Department of Computer Science and Engineering, The Ohio State University, Columbus, Ohio 43210, USA.

Published: May 2023

The broad adoption of electronic health records (EHRs) provides great opportunities to conduct healthcare research and solve various clinical problems in medicine. With recent advances and successes, methods based on machine learning and deep learning have become increasingly popular in medical informatics. Combining data from multiple modalities may improve performance on predictive tasks. To assess the potential of multimodal data, we introduce a comprehensive fusion framework designed to integrate temporal variables, medical images, and clinical notes from EHRs for enhanced performance in downstream predictive tasks. Early, joint, and late fusion strategies were employed to combine data from the different modalities. Model performance and contribution scores show that multimodal models outperform uni-modal models on various tasks. Additionally, temporal signals carry more predictive information than CXR images and clinical notes in the three predictive tasks we explored. Models integrating different data modalities can therefore perform better on predictive tasks.
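The early and late fusion strategies mentioned above can be illustrated with a minimal sketch. This is not the authors' implementation; the modality dimensions, random features, and linear scoring weights are all hypothetical stand-ins, and the joint-fusion strategy (jointly learned shared representations) is omitted for brevity. Early fusion concatenates per-modality features before a single predictor; late fusion applies one predictor per modality and averages the resulting risk scores.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-patient embeddings for three EHR modalities
# (dimensions chosen arbitrarily for illustration).
temporal = rng.normal(size=(4, 8))   # e.g. vital-sign time-series features
image    = rng.normal(size=(4, 16))  # e.g. CXR image features
notes    = rng.normal(size=(4, 12))  # e.g. clinical-note features
feats = [temporal, image, notes]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def early_fusion(feats, w):
    """Concatenate modality features, then apply one linear risk scorer."""
    x = np.concatenate(feats, axis=1)   # shape (n_patients, 8 + 16 + 12)
    return sigmoid(x @ w)               # one risk score per patient

def late_fusion(feats, ws):
    """Score each modality separately, then average the per-modality scores."""
    scores = [sigmoid(x @ w) for x, w in zip(feats, ws)]
    return np.mean(scores, axis=0)

# Hypothetical (untrained) weights, one vector per predictor.
w_early = rng.normal(size=36)
ws_late = [rng.normal(size=f.shape[1]) for f in feats]

print(early_fusion(feats, w_early).shape)  # (4,)
print(late_fusion(feats, ws_late).shape)   # (4,)
```

In practice each linear scorer would be a trained model (and joint fusion would share hidden layers across modalities), but the data flow — fuse before prediction versus fuse the predictions — is the same.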


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10246140
DOI: http://dx.doi.org/10.1101/2023.05.18.23290207

