

EEG-based Attention States Classification via CNN-NLSTM Model
Abstract: Electroencephalogram (EEG)-based attention state detection is of great significance for expanding the range of applications of brain-computer interface technology. To improve the accuracy of attention state classification, this paper proposes an EEG signal classification model based on a Convolutional Neural Network and Nested Long Short-Term Memory (CNN-NLSTM). First, the power spectral density of the EEG signals is obtained by the Welch method and represented as a two-dimensional grayscale image. Then, the CNN is used to learn features that characterize attention states from the grayscale images, and the resulting features are fed into the nested LSTM network to sequentially obtain attention features over all time steps. Finally, the two networks are connected in sequence to build a deep learning framework for attention state classification. Experimental results show that the proposed model, evaluated by multiple runs of 5-fold cross-validation, achieves an average classification accuracy of 89.26% and a maximum classification accuracy of 90.40%, outperforming other models in both classification performance and stability.
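The preprocessing step described in the abstract (Welch power spectral density rendered as a two-dimensional grayscale image) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the sampling rate, window length, log scaling, and min-max normalization are all assumptions chosen for the example.

```python
import numpy as np
from scipy.signal import welch

def eeg_to_grayscale(eeg, fs=256, nperseg=256):
    """Convert multi-channel EEG (channels x samples) into a 2-D
    grayscale image of per-channel Welch power spectral densities.

    fs and nperseg are illustrative assumptions, not the paper's settings.
    """
    # One PSD row per channel: rows = channels, columns = frequency bins.
    freqs, psd = welch(eeg, fs=fs, nperseg=nperseg)
    # Log scaling compresses the large dynamic range of EEG power.
    log_psd = 10.0 * np.log10(psd + 1e-12)
    # Min-max normalize to 8-bit grayscale intensities in [0, 255].
    lo, hi = log_psd.min(), log_psd.max()
    span = hi - lo if hi > lo else 1.0
    img = (log_psd - lo) / span * 255.0
    return img.astype(np.uint8)

# Example: 8 channels, 4 seconds of synthetic EEG at 256 Hz.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((8, 4 * 256))
image = eeg_to_grayscale(eeg)
print(image.shape)  # (8, 129): nperseg // 2 + 1 frequency bins per channel
```

Images of this form can then be batched and passed to a CNN front end, whose feature maps are sequenced into the NLSTM stage as the abstract describes.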
Authors: SHEN Zhenqian, LI Wenqiang, REN Tiantian, WANG Yao, ZHAO Huijuan (School of Electronics and Information Engineering, Tiangong University, Tianjin 300387, China; School of Life Sciences, Tiangong University, Tianjin 300387, China)
Source: Journal of Chinese Information Processing (《中文信息学报》), CSCD / Peking University Core Journal, 2024, Issue 4, pp. 38-49 (12 pages)
Funding: National Natural Science Foundation of China (61701342); Tianjin Science and Technology Program (22KPXMRC00060)
Keywords: attention state; electroencephalogram signal; convolutional neural network; nested long short-term memory network; power spectral density