Two multilayer neural networks were designed to discriminate vigilance states (waking, paradoxical sleep, and non-REM sleep) in the rat using a single parieto-occipital EEG derivation. After filtering (bandwidth 3.18-25 Hz) and digitization at 512 Hz, the EEG signal was segmented into eight-second epochs. Five variables (three statistical, two temporal) were extracted from each epoch. The first network performed an epoch-by-epoch classification, while the second also used contextual information from contiguous epochs. A dedicated postprocessing procedure was developed to enhance the networks' vigilance-state discrimination, and paradoxical sleep estimation in particular. The classifications made by the networks (with or without postprocessing) for six rats were compared with those made by two human experts using EEG and EMG information on 63,000 epochs. High rates of agreement (>90%) between human and neural-network classifications were obtained. Given its potential for further development and its applicability to other signals, the method could prove valuable in biomedical research.
DOI: [10.1016/0031-9384(95)02214-7](http://dx.doi.org/10.1016/0031-9384(95)02214-7)
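The abstract describes the full pipeline: 8-second epochs at 512 Hz, five features per epoch, an epoch-wise network, a second network that also sees contiguous epochs, and a postprocessing pass over the label sequence. A minimal sketch of how such a pipeline could be wired up is given below. The specific feature set, context window, network sizes, and smoothing rule are all assumptions, since the abstract names none of them.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

FS = 512            # digitization rate (Hz), as in the paper
EPOCH_SEC = 8       # epoch length (s), as in the paper
EPOCH_LEN = FS * EPOCH_SEC

def segment(signal):
    """Cut a filtered single-channel EEG trace into non-overlapping 8-s epochs."""
    n = len(signal) // EPOCH_LEN
    return signal[:n * EPOCH_LEN].reshape(n, EPOCH_LEN)

def extract_features(epoch):
    """Five variables per epoch. The abstract specifies three statistical and
    two temporal variables without naming them; the choices below (variance,
    skewness, kurtosis, zero-crossing rate, mean absolute first difference)
    are illustrative stand-ins, not the paper's features."""
    x = epoch - epoch.mean()
    std = x.std() + 1e-12
    statistical = [x.var(), np.mean((x / std) ** 3), np.mean((x / std) ** 4)]
    temporal = [
        np.mean(np.diff(np.sign(x)) != 0),   # zero-crossing rate
        np.mean(np.abs(np.diff(x))),         # mean absolute first difference
    ]
    return np.array(statistical + temporal)

def with_context(F, k=1):
    """Concatenate each epoch's features with those of its k neighbours on
    either side (edges padded by repetition) -- one plausible reading of
    'contextual information from contiguous epochs'."""
    pad = np.vstack([F[:1]] * k + [F] + [F[-1:]] * k)
    return np.hstack([pad[i:i + len(F)] for i in range(2 * k + 1)])

def smooth(labels, w=3):
    """Hypothetical postprocessing: sliding-window majority vote over the
    predicted label sequence. The paper's actual procedure (which notably
    improved paradoxical-sleep estimation) is not described in the abstract."""
    y = np.asarray(labels).copy()
    h = w // 2
    for i in range(h, len(y) - h):
        v, c = np.unique(y[i - h:i + h + 1], return_counts=True)
        y[i] = v[c.argmax()]
    return y

# net1 classifies each epoch in isolation; net2 also sees neighbouring epochs.
net1 = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000)
net2 = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000)
```

Training would pair each feature row (or contextual row from `with_context`) with an expert label; the majority-vote `smooth` merely stands in for whatever rule the authors used to suppress spurious paradoxical-sleep epochs.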