Abstract
The prediction of obstructive sleep apnea (OSA) is crucial for sleep health warning. However, traditional methods often ignore the spatial and temporal dependencies among physiological signals. A deep learning framework based on a recurrent graph convolutional network (RGCN) was proposed to mine the spatiotemporal features of respiratory physiological signals, using a graph structure to model the correlations among the signals. In the first stage, a spatiotemporal recurrent convolution block was constructed, in which graph convolution and recurrent convolution extracted the spatial and temporal features of the signals and predicted the sleep breathing signals. In the second stage, OSA events were classified from the predicted respiratory signals by convolution operations. Experimental results showed that the RGCN model captured the spatiotemporal correlation of respiratory signals: the lowest MAE and RMSE of respiratory signal prediction reached 1.0613 and 2.9941, respectively, and the F1 score of OSA event prediction using the predicted respiratory signals reached 89.7%, a clear improvement over other methods.
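A minimal PyTorch sketch of the two-stage idea summarized above: graph convolution for the spatial dependencies among signal channels, a recurrent layer for the temporal dependencies, and a convolutional classifier for OSA events. All module names, layer sizes, and the fully connected signal graph are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class GraphConv(nn.Module):
    """Single graph convolution: aggregate features over the signal graph."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # x: (batch, num_signals, in_dim); adj: normalized adjacency (num_signals, num_signals)
        return torch.relu(self.linear(adj @ x))


class SpatioTemporalBlock(nn.Module):
    """Stage 1: graph convolution (spatial) + GRU (temporal) to predict the next signal values."""

    def __init__(self, num_signals, hidden_dim=32):
        super().__init__()
        self.gconv = GraphConv(1, hidden_dim)
        self.gru = nn.GRU(num_signals * hidden_dim, num_signals * hidden_dim, batch_first=True)
        self.out = nn.Linear(num_signals * hidden_dim, num_signals)

    def forward(self, x, adj):
        # x: (batch, time, num_signals) -- multichannel respiratory signals
        b, t, n = x.shape
        spatial = self.gconv(x.reshape(b * t, n, 1), adj)   # (b*t, n, hidden)
        spatial = spatial.reshape(b, t, -1)                 # (b, t, n*hidden)
        temporal, _ = self.gru(spatial)                     # (b, t, n*hidden)
        return self.out(temporal[:, -1])                    # predicted next value per signal


class OSAClassifier(nn.Module):
    """Stage 2: 1-D convolution over a window of predicted signals to classify OSA events."""

    def __init__(self, num_signals, num_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(num_signals, 16, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
            nn.Linear(16, num_classes),
        )

    def forward(self, pred_window):
        # pred_window: (batch, num_signals, window_len) of predicted respiratory signals
        return self.net(pred_window)


# Toy usage with 3 hypothetical respiratory channels and a fully connected graph.
num_signals = 3
adj = torch.ones(num_signals, num_signals) / num_signals   # simple normalized adjacency
stage1 = SpatioTemporalBlock(num_signals)
stage2 = OSAClassifier(num_signals)
x = torch.randn(8, 30, num_signals)                        # batch of 30-step signal windows
next_vals = stage1(x, adj)                                  # (8, 3) predicted values
logits = stage2(torch.randn(8, num_signals, 30))            # OSA / non-OSA logits
```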
Authors
ZHANG Enming
YUAN Yue
TENG Fei
YAO Yuan
ZHANG Haibo
(School of Computing and Artificial Intelligence, Southwest Jiaotong University, Chengdu 611756, China; West China Hospital, Sichuan University, Chengdu 610041, China; Department of Computer Science, University of Otago, Dunedin 9054, New Zealand)
Source
《郑州大学学报(理学版)》
CAS
Peking University Core Journal
2023, No. 6, pp. 71-76 (6 pages)
Journal of Zhengzhou University: Natural Science Edition
Funding
International Science and Technology Innovation Cooperation Project of Sichuan Province (2022YFH0020).
Keywords
obstructive sleep apnea prediction
respiratory physiological signals
recurrent graph convolution
deep learning