Sleep Staging for Wearable Electroencephalography Leveraging Machine Learning and Conventional Polysomnography Datasets
Author
Yilmaz, Gurkan
Martinez, Cristina Sainz
Braun, Fabian
Gnarra, Oriella
Seiler, Andrea
Schindler, Kaspar
Schmidt, Markus H
Lemay, Mathieu
Jorge, João
DOI
10.1109/EMBC58623.2025.11254304
Abstract
Wearable electroencephalography (EEG) sensors are becoming increasingly accessible, enabling long-term, at-home automatic sleep profiling, with outstanding potential for personalized medicine and wellness applications. However, current devices remain less reliable for sleep stage classification than gold-standard, lab-based polysomnography (PSG). One of the main bottlenecks is the need to collect extensive datasets for each new wearable device, together with a gold-standard reference, to train robust algorithms. To tackle this challenge, we investigated machine learning and feature engineering techniques on a recent PSG dataset containing multi-channel EEG and reference sleep stage labels, to train classifiers tailored for sleep staging on a separate, wearable EEG headband (ULTEEMNite). We found that classification performance improved when: (i) training on EEG configurations spatially closer to that of the wearable device, (ii) filtering the training data to approximate the spectral profile of the wearable, and (iii) including information from neighboring signal periods. We also identified the most relevant signal features for classification, across different stages and sensor configurations. These findings enable the development of more effective algorithms for sleep staging with wearable EEG, to pursue its vast clinical potential.
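Step (ii) of the abstract, filtering PSG training data to approximate the spectral profile of the wearable device, can be illustrated with a simple band-pass filter over training epochs. This is a minimal sketch only: the band edges, sampling rate, and function name below are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def match_spectral_profile(epochs, fs, band=(0.5, 30.0), order=4):
    """Band-pass filter PSG training epochs (n_epochs, n_samples) so their
    spectral content roughly matches a wearable EEG device's passband.
    The band edges here are hypothetical, not the paper's values."""
    sos = butter(order, band, btype="bandpass", fs=fs, output="sos")
    # Zero-phase filtering avoids introducing phase distortion in the epochs
    return sosfiltfilt(sos, epochs, axis=-1)

# Example: filter four synthetic 30-second epochs sampled at 256 Hz
fs = 256
rng = np.random.default_rng(0)
epochs = rng.standard_normal((4, 30 * fs))
filtered = match_spectral_profile(epochs, fs)
```

In practice the target band would be chosen to mimic the wearable's measured frequency response, so that classifiers trained on the filtered PSG channels transfer better to the wearable signal.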
Publication Reference
EMBC 2025, Copenhagen (DK)
Year
2025-07-14