Removing electrocardiographic (ECG) artifacts caused by QRS complexes from single-channel electroencephalography (EEG) and electro-oculography (EOG) can be problematic, especially when no reference ECG signal is available. This study examined a simple estimation method that excludes the likely QRS portions of the EOG trace before spectrum estimation. The method was tested with a simple sleep classifier based on the 0.5-30 Hz mean frequency of a single-channel sleep EOG, with the left EOG electrode referenced to the left mastoid (EOG L-M1). When QRS peaks were automatically excluded from the least-squares (LS) mean frequency estimation, the average optimal mean frequency threshold decreased from 9.3 Hz to 8.8 Hz and, compared with traditional spectral estimation, agreement increased from 89% to 90% and Cohen's kappa from 0.44 to 0.50.
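
Because the samples that remain after the QRS segments are dropped are unevenly spaced, a least-squares (Lomb-Scargle type) periodogram can be fitted to them directly, without interpolating across the artifact. The sketch below shows one way such a QRS-excluded mean frequency could be computed in Python; the exclusion window width, frequency grid, and all names are illustrative assumptions rather than the paper's actual implementation.

    import numpy as np
    from scipy.signal import lombscargle

    def ls_mean_frequency(eog, fs, qrs_peaks, excl_ms=120.0,
                          f_lo=0.5, f_hi=30.0, n_freqs=256):
        """Least-squares mean frequency of one EOG epoch, ignoring samples
        around detected QRS peaks (all parameter values are assumptions)."""
        t = np.arange(len(eog)) / fs                  # sample times in seconds
        keep = np.ones(len(eog), dtype=bool)
        half = int(round(excl_ms / 1000.0 * fs / 2))  # half exclusion window, in samples
        for p in qrs_peaks:                           # indices of detected R peaks
            keep[max(0, p - half):p + half] = False   # drop samples around each peak

        y = eog[keep] - eog[keep].mean()              # remove the DC offset before the LS fit
        freqs = np.linspace(f_lo, f_hi, n_freqs)      # analysis band, 0.5-30 Hz
        pgram = lombscargle(t[keep], y, 2 * np.pi * freqs)  # LS periodogram (angular frequencies)
        return np.sum(freqs * pgram) / np.sum(pgram)  # spectral centroid = mean frequency

The epoch's mean frequency would then be compared against a threshold to score sleep; with QRS exclusion the paper reports an average optimal threshold of about 8.8 Hz.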


Source: http://dx.doi.org/10.1109/IEMBS.2007.4352359

Publication Analysis

Top Keywords: single channel (8); estimation method (8); eog (5); reducing effects (4); effects electrocardiographic (4); electrocardiographic artifacts (4); artifacts electro-oculography (4); electro-oculography automatic (4); automatic sleep (4); sleep analysis (4)

Similar Publications

Manual segmentation of lesions, required for radiotherapy planning and follow-up, is time-consuming and error-prone. Automatic detection and segmentation can assist radiologists in these tasks. This work explores the automated detection and segmentation of brain metastases (BMs) in longitudinal MRIs.


To address the limitations of a single-feature input channel structure, scarce fault training data, and insufficient feature learning in noisy environments for intelligent diagnostic models of mechanical equipment, we propose a method based on a one-dimensional and two-dimensional dual-channel feature information fusion convolutional neural network (1D_2DIFCNN). We construct the dual-channel feature information fusion convolutional network with a Convolutional Block Attention Mechanism and use a Random Overlapping Sampling Technique to process the raw vibration signals. The model takes as inputs both the one-dimensional data and the two-dimensional Continuous Wavelet Transform images.
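
A rough sketch of the dual-channel idea, assuming a PyTorch-style implementation: one branch convolves the raw one-dimensional vibration segment, the other convolves its two-dimensional CWT image, and the pooled features are concatenated before classification. The layer sizes, the simplified channel-attention gate standing in for a full Convolutional Block Attention Module, and fusion by concatenation are guesses, not the authors' 1D_2DIFCNN configuration.

    import torch
    import torch.nn as nn

    class ChannelAttention(nn.Module):
        """Simplified channel-attention gate (stand-in for a full CBAM block)."""
        def __init__(self, channels, reduction=4):
            super().__init__()
            self.fc = nn.Sequential(
                nn.Linear(channels, channels // reduction), nn.ReLU(),
                nn.Linear(channels // reduction, channels), nn.Sigmoid())

        def forward(self, x):                         # x: (B, C, H, W)
            w = self.fc(x.mean(dim=(2, 3)))           # squeeze spatial dims -> (B, C)
            return x * w[:, :, None, None]            # re-weight feature maps

    class DualChannelFusionNet(nn.Module):
        """Two branches (1D raw signal, 2D CWT image) fused by concatenation."""
        def __init__(self, n_classes=10):
            super().__init__()
            self.branch1d = nn.Sequential(            # raw vibration segment, shape (B, 1, L)
                nn.Conv1d(1, 16, 15, stride=2, padding=7), nn.ReLU(),
                nn.Conv1d(16, 32, 7, stride=2, padding=3), nn.ReLU(),
                nn.AdaptiveAvgPool1d(1), nn.Flatten())
            self.branch2d = nn.Sequential(            # CWT image, shape (B, 1, H, W)
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                ChannelAttention(32),
                nn.AdaptiveAvgPool2d(1), nn.Flatten())
            self.classifier = nn.Linear(32 + 32, n_classes)

        def forward(self, sig, img):
            fused = torch.cat([self.branch1d(sig), self.branch2d(img)], dim=1)
            return self.classifier(fused)             # fault-class logits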


For fine-scale representation of complex reservoir units, traditional stochastic simulation is limited by high computational cost and low spatial resolution. Generative Adversarial Networks (GANs) can instead reproduce the spatial distribution patterns of regional variables through high-order statistical fitting. However, the parameters of GANs cannot be optimized when training samples are insufficient.


Single-Cell Sequencing and Machine Learning Integration to Identify Candidate Biomarkers in Psoriasis.

J Inflamm Res

December 2024

Department of Dermatology, China-Japan Friendship Hospital, National Center for Integrative Medicine, Beijing, 100029, People's Republic of China.

Background: Psoriasis is a persistent, immune-driven inflammatory skin condition for which well-established biologic treatments free of adverse events are still lacking. Consequently, the identification of novel targets and therapeutic agents remains a pressing priority in psoriasis research.

Methods: We collected single-cell RNA sequencing (scRNA-seq) datasets and inferred T cell differentiation trajectories through pseudotime analysis.
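
As an illustration only, a diffusion-pseudotime trajectory step like the one described could be run with Scanpy roughly as follows; the input file, root-cell choice, and parameter values are hypothetical, not the study's actual pipeline.

    import scanpy as sc

    adata = sc.read_h5ad("t_cells.h5ad")          # assumed: pre-filtered T-cell subset
    sc.pp.normalize_total(adata, target_sum=1e4)  # library-size normalization
    sc.pp.log1p(adata)
    sc.pp.pca(adata, n_comps=30)
    sc.pp.neighbors(adata, n_neighbors=15)        # kNN graph for the diffusion map

    sc.tl.diffmap(adata)                          # diffusion map embedding
    adata.uns["iroot"] = 0                        # assumed root cell (e.g. a naive T cell)
    sc.tl.dpt(adata)                              # diffusion pseudotime per cell

    print(adata.obs["dpt_pseudotime"].head())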


Study Objectives: Polysomnography (PSG) currently serves as the benchmark for evaluating sleep disorders. Its discomfort makes long-term monitoring unfeasible, leading to bias in sleep quality assessment. Hence, less invasive, cost-effective, and portable alternatives need to be explored.

