Despite many studies aimed at predicting severe coronavirus disease 2019 (COVID-19) cases, there is no clinically applicable prediction model for identifying severe patients early. Based on laboratory and demographic data, we developed and validated a deep learning model to predict survival and assist in the triage of COVID-19 patients in the early stages. This retrospective study developed a survival prediction model based on a deep learning method using demographic and laboratory data. The database consisted of 487 patients with COVID-19, diagnosed by the reverse transcription-polymerase chain reaction test and admitted to Imam Khomeini Hospital, affiliated with Tehran University of Medical Sciences, from February 21, 2020, to June 24, 2020. The developed model achieved an area under the curve (AUC) of 0.96 for survival prediction, with high precision (0.95, 0.93), recall (0.90, 0.97), and F1-scores (0.93, 0.95) for the low- and high-risk groups, respectively. This deep learning-based, data-driven prediction tool classifies admitted patients into low-risk and high-risk groups and can help triage patients in the early stages.
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9774992
DOI: http://dx.doi.org/10.47176/mjiri.36.144
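The per-group metrics reported above follow their standard definitions. As a reminder of how they are computed, here is a minimal pure-Python sketch; the counts and scores below are illustrative, not the study's data:

```python
def prf(tp, fp, fn):
    """Precision, recall, and F1 from per-class error counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

def auc(scores, labels):
    """ROC AUC via the Mann-Whitney statistic: the probability that a
    randomly chosen positive case outscores a randomly chosen negative
    one (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

Under this reading, the model's AUC of 0.96 means that a randomly chosen patient from one outcome group received a higher risk score than a randomly chosen patient from the other about 96% of the time.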
Biomed Phys Eng Express
January 2025
Chiba University Center for Frontier Medical Engineering, 1-33 Yayoi-cho, Inage-ku, Chiba, Chiba, 263-8522, Japan.
Traumatic injury remains a leading cause of death worldwide, with traumatic bleeding being one of its most critical and fatal consequences. The use of whole-body computed tomography (WBCT) in trauma management has rapidly expanded. However, interpreting WBCT images within the limited time available before treatment is particularly challenging for acute care physicians.
PLoS One
January 2025
Department of Computer Science and Mathematics, Lebanese American University, Beirut, Lebanon.
In human activity-recognition scenarios, including head and whole-body pose and orientation, recognizing the pose and direction of a pedestrian is a complex problem. A person may be walking in one direction while directing their attention elsewhere. It is often desirable to analyze such orientation estimates with computer-vision tools for automated analysis of pedestrian behavior and intention.
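One simple geometric proxy for body orientation is the angle of the shoulder line in the image plane; a minimal sketch, assuming hypothetical 2D keypoint inputs (real orientation estimators use learned regressors, not this simplification):

```python
import math

def torso_yaw(left_shoulder, right_shoulder):
    """Estimate a facing angle (degrees, image coordinates) as the
    direction perpendicular to the left-to-right shoulder line.
    Inputs are (x, y) keypoints; this is a toy geometric proxy."""
    dx = right_shoulder[0] - left_shoulder[0]
    dy = right_shoulder[1] - left_shoulder[1]
    return (math.degrees(math.atan2(dy, dx)) + 90.0) % 360.0
```

For shoulders lying on a horizontal line, `torso_yaw((0, 0), (1, 0))` returns 90.0, i.e. the body faces perpendicular to the shoulder axis.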
Hypertension is a critical risk factor and cause of mortality in cardiovascular diseases, and it remains a global public health issue. Therefore, understanding its mechanisms is essential for treating and preventing hypertension. Gene expression data are an important source of hypertension biomarkers.
PLoS One
January 2025
Engineering Research Center of Hydrogen Energy Equipment & Safety Detection, Universities of Shaanxi Province, Xijing University, Xi'an, China.
The traditional method of corn quality detection relies heavily on the subjective judgment of inspectors and suffers from a high error rate. To address these issues, this study employs the Swin Transformer as an enhanced base model, integrating machine vision and deep learning techniques for corn quality assessment. Initially, images of high-quality, moldy, and broken corn were collected.
Bioinformatics
January 2025
Department of Biology, Emory University, Atlanta, GA 30322, United States.
Motivation: In silico functional annotation of proteins is crucial to narrowing the sequencing-accelerated gap in our understanding of protein activities. Numerous function annotation methods exist, and their number has been growing, particularly with recent deep learning-based developments. However, it is unclear whether these tools are truly predictive.