Abstract
A remaining useful life (RUL) prediction method based on a self-attention convolutional neural network (CNN) and a bidirectional long short-term memory network (BiLSTM) is proposed to address the limited feature-extraction ability of existing RUL prediction methods for rolling bearings and their insufficient use of the spatial-temporal information contained in the data. Different time-domain indicators of the vibration signals are fed into an improved self-attention CNN module, which extracts the spatial feature information among the indicators and applies self-attention weighting to enhance the feature-extraction effect. A BiLSTM layer then extracts degradation features from the time-series data, and a fully connected layer outputs the predicted RUL of the bearing. Validation on the FEMTO-ST rolling bearing dataset shows that, compared with the CNN, BiLSTM, and CNN-BiLSTM models, the self-attention CNN-BiLSTM model achieves lower RUL prediction error and better performance evaluation indexes. The fusion of CNN and BiLSTM together with the self-attention mechanism improves the prediction accuracy of the model and makes it tend toward early predictions, which is conducive to predictive maintenance.
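The pipeline described above starts from time-domain indicators computed on windows of the raw vibration signal. The abstract does not list which indicators the authors use, so the four below (RMS, peak, kurtosis, crest factor) are assumed typical choices; this is a minimal NumPy sketch of that preprocessing step, not the paper's implementation:

```python
import numpy as np

def time_domain_features(x: np.ndarray) -> dict:
    """Compute common time-domain indicators of one vibration window.

    The indicator set is an assumption for illustration; the paper's
    exact feature list is not given in the abstract.
    """
    rms = np.sqrt(np.mean(x ** 2))                      # root mean square
    peak = np.max(np.abs(x))                            # peak amplitude
    # raw fourth-moment kurtosis, E[(x - mean)^4] / var^2
    kurt = np.mean((x - x.mean()) ** 4) / np.var(x) ** 2
    crest = peak / rms                                  # crest factor
    return {"rms": rms, "peak": peak, "kurtosis": kurt, "crest": crest}

# Example: indicators of a unit-amplitude sine window (5 full cycles)
t = np.linspace(0.0, 1.0, 1000, endpoint=False)
feats = time_domain_features(np.sin(2 * np.pi * 5 * t))
```

One such feature vector per window, stacked over consecutive windows, would form the indicator-by-time matrix that a CNN-BiLSTM model of this kind takes as input.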
Authors
HUI Jingming; WANG Jian; WU Shuang; HUANG Yongming; WANG Ziqi (Inner Mongolia Huomeihongjun Aluminium & Electricity Co., Ltd., Tongliao 029200, China; Hunan Zhongrong Huizhi Information Technology Co., Ltd., Changsha 410221, China; College of Control Science and Engineering, Zhejiang University, Hangzhou 310027, China)
Source
Bearing (《轴承》), 2024, No. 3, pp. 92-98 (7 pages); indexed in the Peking University Core Journal list.
Funding
General Program of the National Natural Science Foundation of China (61973269).
Keywords
rolling bearing
remaining useful life
life prediction
deep learning
convolutional neural network
bidirectional long short-term memory
self-attention