Sleep monitoring by polysomnography (PSG) severely degrades sleep quality. To reduce the load of sleep monitoring, an approach to automatic sleep stage classification without an electroencephalogram (EEG) was proposed.

A total of 124 records from the public ISRUC-Sleep dataset, which incorporates American Academy of Sleep Medicine (AASM) standards, were used: 10 records were from the healthy group and the others were from sleep-disorder groups. The 124 records were collected from 116 subjects (eight subjects had two records each; the others had one record each) with ages ranging from 20 to 85 years. A total of 108 features were extracted from the two-channel electrooculogram (EOG) and six features were extracted from the one-channel electromyogram (EMG). A novel 'quasi-normalization' method was proposed and used for feature normalization. The random forest algorithm was then used to classify five stages: wakefulness, rapid eye movement (REM) sleep, N1 sleep, N2 sleep and N3 sleep.

Using 114 normalized features from the combination of EOG (108 features) and EMG (6 features) data, Cohen's kappa coefficient was 0.749 and the accuracy was 80.8% by leave-one-out cross-validation. As a reference for AASM standards using a computer-assisted method, Cohen's kappa coefficient was 0.801 and the accuracy was 84.7% on the same dataset using 438 normalized features from a combination of EEG (324 features), EOG (108 features) and EMG (6 features) data.

The combination of EOG and EMG can reduce the load of sleep monitoring and achieves performance comparable to the 'gold standard' EEG, EOG and EMG signals for sleep stage classification.
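As a rough illustration of the evaluation pipeline described above, the sketch below runs a record-wise leave-one-out cross-validation of a random-forest five-stage classifier and reports Cohen's kappa and accuracy with scikit-learn. The feature matrix `X`, stage labels `y`, per-record grouping `groups`, and the number of trees are all placeholders, and the paper's 'quasi-normalization' step is not reproduced here.

```python
# Minimal sketch: leave-one-record-out evaluation of a random-forest
# five-stage sleep classifier, scored with Cohen's kappa and accuracy.
# X (epochs x 114 EOG+EMG features), y (stage labels), and groups
# (per-epoch record IDs) are hypothetical inputs; hyperparameters are
# illustrative, not the authors' settings.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.metrics import cohen_kappa_score, accuracy_score

def evaluate(X, y, groups, n_trees=500, seed=0):
    """Leave-one-record-out cross-validation over PSG records."""
    logo = LeaveOneGroupOut()
    y_true, y_pred = [], []
    for train_idx, test_idx in logo.split(X, y, groups):
        clf = RandomForestClassifier(n_estimators=n_trees,
                                     random_state=seed, n_jobs=-1)
        clf.fit(X[train_idx], y[train_idx])
        y_true.append(y[test_idx])
        y_pred.append(clf.predict(X[test_idx]))
    y_true = np.concatenate(y_true)
    y_pred = np.concatenate(y_pred)
    return cohen_kappa_score(y_true, y_pred), accuracy_score(y_true, y_pred)

# Example call with hypothetical arrays:
# kappa, acc = evaluate(X, y, groups)
# print(f"kappa={kappa:.3f}, accuracy={acc:.3f}")
```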
DOI: http://dx.doi.org/10.1088/1361-6579/ac6bdb
NPJ Digit Med
January 2025
Graduate School of Data Science, Seoul National University, Seoul, Republic of Korea.
Polysomnography (PSG) is crucial for diagnosing sleep disorders, but manual scoring of PSG is time-consuming and subjective, leading to high variability. While machine-learning models have improved PSG scoring, their clinical use is hindered by their 'black-box' nature. In this study, we present SleepXViT, an automatic sleep staging system using a Vision Transformer (ViT) that provides intuitive, consistent explanations by mimicking human 'visual scoring'.
Diagnostics (Basel)
January 2025
Department of Dental Prosthetics, Faculty of Dentistry, University of Medicine and Pharmacy of Craiova, 200349 Craiova, Romania.
The study aimed to validate the diagnostic system proposed by the Standardized Tool for the Assessment of Bruxism (STAB) by correlating the results obtained from the questionnaire with those from non-instrumental and instrumental tools. The study had three stages (questionnaire, clinical examination, and electromyographic study). The subjects completed a questionnaire and a clinical examination.
Sci Rep
January 2025
International Institute for Integrative Sleep Medicine (WPI-IIIS), University of Tsukuba, Tsukuba, Ibaraki, 305-8575, Japan.
We explore an innovative approach to sleep stage analysis by incorporating complexity features into sleep scoring methods for mice. Traditional sleep scoring relies on power spectral features of the electroencephalogram (EEG) and the amplitude of the electromyogram (EMG). We introduced a novel methodology for sleep stage classification based on two types of complexity analysis, namely multiscale entropy and detrended fluctuation analysis.
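For context, detrended fluctuation analysis is one of the two complexity measures named above; a minimal NumPy sketch is given below. It is a generic, textbook-style implementation rather than the authors' code, and the window scales and first-order detrending are illustrative choices.

```python
# Minimal detrended fluctuation analysis (DFA) sketch in NumPy.
# Generic illustration only; scales and the linear detrending order
# are arbitrary example choices, not parameters from the study.
import numpy as np

def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
    """Return the DFA scaling exponent alpha of a 1-D signal x."""
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())          # integrated, mean-removed signal
    flucts = []
    for n in scales:
        n_windows = len(profile) // n
        rms = []
        for w in range(n_windows):
            seg = profile[w * n:(w + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # linear detrend per window
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        flucts.append(np.mean(rms))
    # Slope of log F(n) versus log n gives the scaling exponent alpha
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

# Example with a hypothetical 1-D EEG epoch:
# alpha = dfa_exponent(eeg_epoch)
```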
Nurs Rep
January 2025
School of Nursing, University of Minho, 4710-057 Braga, Portugal.
In Portugal, evidence regarding the mental health of institutionalized older people is limited, leaving this area poorly described and the mental health needs of this population largely unknown. This research aims to describe the mental health of older persons residing in nursing homes in Northern Portugal. A cross-sectional study will be conducted.