ePrints

Making Sense of Sleep: Multimodal Sleep Stage Classification in a Large, Diverse Population Using Movement and Cardiac Sensing

Lookup NU author(s): Dr Bing Zhai, Dr Yu Guan (ORCiD)

Downloads

Full text for this publication is not currently held within this repository. Alternative links are provided below where available.


Abstract

© 2020 ACM. Traditionally, sleep monitoring has been performed in hospital or clinic environments, requiring complex and expensive equipment and expert scoring. Wearable devices increasingly provide a viable alternative for sleep monitoring and can collect movement and heart rate (HR) data. In this work, we present a set of algorithms for sleep-wake and sleep-stage classification based upon actigraphy and cardiac sensing in 1,743 participants. We devise movement and cardiac features that can be extracted from research-grade wearable sensors, derive models, and evaluate their performance on the largest open-access dataset for human sleep science. Our results demonstrate that neural network models outperform traditional machine learning methods and heuristic models for both sleep-wake and sleep-stage classification. Convolutional neural networks (CNNs) and long short-term memory (LSTM) networks were the best performers for sleep-wake and sleep-stage classification, respectively. Using SHAP (SHapley Additive exPlanations) with Random Forest, we identified frequency features from cardiac sensors as critical to sleep-stage classification. Finally, we introduce an ensemble-based approach to sleep-stage classification that outperforms all other baselines, achieving an accuracy of 78.2% and an F1 score of 69.8% on the three-stage classification task. Together, this work represents the first systematic multimodal evaluation of sleep-wake and sleep-stage classification in a large, diverse population. Alongside an accurate sleep-stage classification approach, the results highlight multimodal wearable sensing as a scalable method for accurate sleep classification, providing guidance on optimal algorithm deployment for automated sleep assessment. The code used in this study is available at: https://github.com/bzhai/multimodal-sleep-stage-benchmark.git.
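The abstract's ensemble-based approach combines the predictions of several base classifiers. The paper's actual ensemble is defined in the linked repository; as a minimal illustrative sketch only, the snippet below shows a per-epoch majority vote over three hypothetical base models (all names and labels here are assumptions, not taken from the paper):

```python
from collections import Counter

def majority_vote(predictions_per_model):
    """Fuse per-epoch sleep-stage predictions from several models by majority vote."""
    n_epochs = len(predictions_per_model[0])
    fused = []
    for i in range(n_epochs):
        # Count each model's vote for epoch i and keep the most common label
        votes = Counter(model[i] for model in predictions_per_model)
        fused.append(votes.most_common(1)[0][0])
    return fused

# Hypothetical per-epoch outputs of three base models
# (e.g., movement-only, cardiac-only, and a fused model)
movement = ["wake", "nrem", "nrem", "rem"]
cardiac  = ["wake", "nrem", "rem",  "rem"]
fused_m  = ["nrem", "nrem", "rem",  "rem"]

print(majority_vote([movement, cardiac, fused_m]))  # ['wake', 'nrem', 'rem', 'rem']
```

Ties are resolved by `Counter.most_common`'s insertion order; a real deployment would instead weight models by validation performance or average class probabilities.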


Publication metadata

Author(s): Zhai B, Perez-Pozuelo I, Clifton EAD, Palotti J, Guan Y

Publication type: Article

Publication status: Published

Journal: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies

Year: 2020

Volume: 4

Issue: 2

Pages: 1-33

Online publication date: 15/06/2020

Acceptance date: 02/04/2018

ISSN (electronic): 2474-9567

Publisher: Association for Computing Machinery

URL: https://doi.org/10.1145/3397325

DOI: 10.1145/3397325




Funding

Funder name: EPSRC
