Objective: Brain waves vary between individuals. This work aims to improve automatic sleep staging for longitudinal sleep monitoring by personalizing the staging algorithm using individual characteristics extracted from sleep data recorded during the first night.
Approach: Because data from a single night are limited, making model training difficult, we propose a Kullback-Leibler (KL) divergence regularized transfer-learning approach to address this problem. We employ the pretrained SeqSleepNet (i.e. the subject-independent model) as a starting point and fine-tune it with the single-night personalization data to derive the personalized model. This is done by adding the KL divergence between the output of the subject-independent model and that of the personalized model to the loss function during fine-tuning (see the sketch below). In effect, the KL-divergence regularization prevents the personalized model from overfitting to the single-night data and straying too far away from the subject-independent model.
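The following is a minimal sketch of one such KL-regularized fine-tuning step, not the authors' implementation (SeqSleepNet itself is a sequence-to-sequence model and its published code is TensorFlow-based). The function name `kl_regularized_finetune_step` and the arguments `pretrained_model`, `personalized_model`, and `kl_weight` are hypothetical placeholders chosen for illustration; PyTorch is used purely as an assumed framework.

```python
import torch
import torch.nn.functional as F

def kl_regularized_finetune_step(personalized_model, pretrained_model,
                                 batch_x, batch_y, optimizer, kl_weight=0.1):
    """One fine-tuning step with KL-divergence regularization (sketch).

    The loss is the usual cross-entropy on the single-night personalization
    data plus the KL divergence between the subject-independent model's
    output distribution and the personalized model's output distribution,
    which discourages the fine-tuned weights from drifting too far away.
    """
    personalized_model.train()
    pretrained_model.eval()

    # Logits over the sleep stages (e.g. W, N1, N2, N3, REM).
    logits_personalized = personalized_model(batch_x)
    with torch.no_grad():
        logits_pretrained = pretrained_model(batch_x)

    # Standard supervised term on the personalization night.
    ce_loss = F.cross_entropy(logits_personalized, batch_y)

    # KL(subject-independent || personalized), averaged over the batch.
    # F.kl_div expects log-probabilities for the first argument and
    # probabilities for the second.
    log_p_personalized = F.log_softmax(logits_personalized, dim=-1)
    p_pretrained = F.softmax(logits_pretrained, dim=-1)
    kl_loss = F.kl_div(log_p_personalized, p_pretrained, reduction="batchmean")

    loss = ce_loss + kl_weight * kl_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

The weight on the KL term (here `kl_weight`) controls the trade-off: a larger value keeps the personalized model closer to the subject-independent one, while a smaller value lets it adapt more aggressively to the single-night data.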
Main Results: Experimental results on the Sleep-EDF Expanded database, consisting of 75 subjects, show that sleep staging personalization with single-night data is possible with the help of the proposed KL-divergence regularization. On average, we achieve a personalized sleep staging accuracy of 79.6%, a Cohen's kappa of 0.706, a macro F1-score of 73.0%, a sensitivity of 71.8%, and a specificity of 94.2%.
Significance: We find that the approach is robust against overfitting and that it improves accuracy by 4.5 percentage points over the baseline method without personalization and by 2.2 percentage points over personalization without regularization.
DOI: http://dx.doi.org/10.1088/1361-6579/ab921e