Parasitic organisms pose a major global health threat, particularly in regions that lack advanced medical facilities, and early, accurate detection is vital to saving lives. Deep learning models have advanced the medical field by delivering promising results in diagnosing, detecting, and classifying diseases. This paper explores the role of deep learning techniques in detecting and classifying various parasitic organisms. The study uses a dataset of 34,298 samples covering parasites such as Toxoplasma gondii, Trypanosome, Plasmodium, Leishmania, Babesia, and Trichomonad, along with host cells such as red and white blood cells. The images are first converted from RGB to grayscale, after which morphological features such as perimeter, height, area, and width are computed. Otsu thresholding and watershed segmentation are then applied to separate foreground from background and to place markers that identify regions of interest. Ten deep transfer learning models are employed: VGG19, InceptionV3, ResNet50V2, ResNet152V2, EfficientNetB3, EfficientNetB0, MobileNetV2, Xception, DenseNet169, and the hybrid InceptionResNetV2. The parameters of these models are fine-tuned using three optimizers: SGD, RMSprop, and Adam. Experimental results show that with RMSprop, VGG19, InceptionV3, and EfficientNetB0 achieve the highest accuracy of 99.1% with a loss of 0.09. With SGD, InceptionV3 performs best, achieving 99.91% accuracy with a loss of 0.98. With Adam, InceptionResNetV2 excels, reaching the highest accuracy overall of 99.96% with a loss of 0.13. These findings indicate that deep learning models coupled with image processing methods provide a highly accurate and efficient way to detect and classify parasitic organisms.
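The preprocessing steps described above (grayscale images, Otsu thresholding to separate foreground from background, then morphological feature extraction) can be sketched in plain NumPy. This is an illustrative re-implementation, not the paper's code: in practice one would use library routines such as OpenCV's cv2.threshold with cv2.THRESH_OTSU and marker-based cv2.watershed, and perimeter would come from contour tracing (e.g. cv2.findContours with cv2.arcLength); the synthetic image here is a stand-in for a real micrograph.

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method: pick the grayscale threshold that maximizes
    the between-class variance of foreground vs. background."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    cum_prob = np.cumsum(prob)                      # P(class 0) up to t
    cum_mean = np.cumsum(prob * np.arange(256))
    global_mean = cum_mean[-1]
    best_t, best_var = 0, 0.0
    for t in range(256):
        w0, w1 = cum_prob[t], 1.0 - cum_prob[t]
        if w0 == 0.0 or w1 == 0.0:
            continue                                # one class empty
        mu0 = cum_mean[t] / w0
        mu1 = (global_mean - cum_mean[t]) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Synthetic grayscale "micrograph": dark background (~40), bright cell (~200).
rng = np.random.default_rng(0)
img = np.concatenate([
    rng.normal(40, 5, 7000),    # background pixels
    rng.normal(200, 5, 3000),   # foreground (cell) pixels
]).clip(0, 255).astype(np.uint8).reshape(100, 100)

t = otsu_threshold(img)
mask = img > t                                      # foreground mask

# Simple morphological features from the mask; perimeter is omitted
# because it requires contour tracing.
ys, xs = np.nonzero(mask)
area = int(mask.sum())
height = int(ys.max() - ys.min() + 1)
width = int(xs.max() - xs.min() + 1)
```

Because the two intensity modes are well separated, the computed threshold falls in the gap between them and the mask recovers exactly the bright region, from which area, height, and width follow directly.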
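The three optimizers compared in the study differ only in their parameter-update rules. A minimal NumPy sketch of those rules on a toy one-dimensional quadratic makes the difference concrete; this is purely illustrative (the paper fine-tunes full networks, and the learning rates below are arbitrary choices for the toy problem, not the paper's settings):

```python
import numpy as np

def sgd_step(w, g, state, lr=0.01, momentum=0.9):
    """SGD with momentum: accumulate a velocity, step along it."""
    state["v"] = momentum * state.get("v", 0.0) + g
    return w - lr * state["v"]

def rmsprop_step(w, g, state, lr=0.001, rho=0.9, eps=1e-7):
    """RMSprop: scale the step by a running RMS of past gradients."""
    state["s"] = rho * state.get("s", 0.0) + (1 - rho) * g ** 2
    return w - lr * g / (np.sqrt(state["s"]) + eps)

def adam_step(w, g, state, lr=0.001, b1=0.9, b2=0.999, eps=1e-7):
    """Adam: momentum plus RMS scaling, with bias correction."""
    t = state["t"] = state.get("t", 0) + 1
    state["m"] = b1 * state.get("m", 0.0) + (1 - b1) * g
    state["v"] = b2 * state.get("v", 0.0) + (1 - b2) * g ** 2
    m_hat = state["m"] / (1 - b1 ** t)    # correct startup bias
    v_hat = state["v"] / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps)

# Toy objective f(w) = (w - 3)^2 with gradient 2(w - 3);
# all three rules should drive w toward the minimum at 3.
results = {}
for name, step, lr in [("sgd", sgd_step, 0.01),
                       ("rmsprop", rmsprop_step, 0.05),
                       ("adam", adam_step, 0.05)]:
    w, state = 0.0, {}
    for _ in range(3000):
        g = 2.0 * (w - 3.0)
        w = step(w, g, state, lr=lr)
    results[name] = w
```

The adaptive methods (RMSprop, Adam) normalize the gradient by its recent magnitude, so their effective step size is roughly the learning rate regardless of gradient scale; which rule wins in practice is problem-dependent, which is why the study cross-compares all three across its ten models.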

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10923792
DOI: http://dx.doi.org/10.1038/s41598-024-56323-8


