Abstract
To address the problem that vibration acceleration signals of rolling bearings in rotating machinery are contaminated by heavy noise during acquisition, which obscures the signal's characteristic features, a data denoising method is proposed that combines an improved Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN) with wavelet packet threshold denoising. First, the components most correlated with the original signal are identified by computing the root mean square and the cross-correlation coefficient of each component obtained from the decomposition. Then, a threshold is computed separately for each extracted component and wavelet packet threshold denoising is applied. Finally, the processed components are summed with the noise-free components to obtain the denoised signal. Comparative experiments show that the improved CEEMDAN-wavelet packet threshold denoising method increases the signal-to-noise ratio by up to 6.41 and reduces the root-mean-square error by 0.12, demonstrating the effectiveness of the proposed method.
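The selection-and-threshold pipeline summarized above can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the CEEMDAN components are stood in for by synthetic arrays, FFT-coefficient soft thresholding stands in for the paper's wavelet packet transform, the universal threshold rule and the correlation cutoff `corr_cutoff=0.3` are assumptions, and the function names are hypothetical.

```python
import numpy as np

def corr_coef(x, y):
    """Cross-correlation coefficient between a component and the original signal."""
    return np.corrcoef(x, y)[0, 1]

def threshold_denoise(c):
    """Coefficient-domain soft thresholding. The FFT stands in for the paper's
    wavelet packet transform; the per-component threshold follows the universal
    rule sigma * sqrt(2 ln N), sigma estimated from the median coefficient."""
    coeffs = np.fft.rfft(c)
    mag = np.abs(coeffs)
    sigma = np.median(mag) / 0.6745
    t = sigma * np.sqrt(2.0 * np.log(len(c)))
    shrunk = np.maximum(mag - t, 0.0)  # soft-shrink coefficient magnitudes
    coeffs = coeffs * (shrunk / np.maximum(mag, 1e-12))
    return np.fft.irfft(coeffs, n=len(c))

def denoise_from_components(signal, components, corr_cutoff=0.3):
    """Threshold-denoise the components strongly correlated with the original
    signal; add the remaining (treated as noise-free) components unchanged."""
    out = np.zeros_like(signal)
    for c in components:
        if abs(corr_coef(c, signal)) >= corr_cutoff:
            out += threshold_denoise(c)  # noise-carrying component
        else:
            out += c                     # noise-free component, kept as-is
    return out

# Toy demonstration with synthetic stand-ins for CEEMDAN components
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 1000)
clean = np.sin(2 * np.pi * 10 * t)   # signal-carrying oscillation
trend = 0.5 * t                      # slow, noise-free component
noise = 0.3 * rng.standard_normal(t.size)
signal = clean + trend + noise
components = [clean + noise, trend]  # pretend these came from CEEMDAN
denoised = denoise_from_components(signal, components)
err_before = np.sqrt(np.mean((signal - (clean + trend)) ** 2))
err_after = np.sqrt(np.mean((denoised - (clean + trend)) ** 2))
print(err_after < err_before)
```

On this synthetic example the root-mean-square error drops after denoising, mirroring the direction of the paper's reported improvement; the actual method additionally uses the RMS of each component in the selection step and computes thresholds in the wavelet packet domain.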
Authors
ZHOU Zhengnan; LIU Mei; WU Binxin; MO Changchun; GAO Xingquan; ZHANG Fei
(Jilin Institute of Chemical Technology, Jilin 132022, China; Guangdong University of Petrochemical Technology, Maoming, Guangdong 525000, China; Dongguan University of Technology, Dongguan 523419, China; Dalian Jiaotong University, Dalian, Liaoning 116028, China)
Source
Automation & Instrumentation (《自动化与仪器仪表》)
2023, No. 4, pp. 285-289 (5 pages)
Funding
General Program of the National Natural Science Foundation of China (62073091)
Special Project in Key Fields (New-Generation Information Technology) of Guangdong Universities (2020ZDZX3042)
Robotics and Intelligent Equipment Innovation Center of Dongguan University of Technology (KCYCXPT2017006)
Key Laboratory of Robotics and Intelligent Equipment of Guangdong Regular Institutions of Higher Education (2017KSYS009)
Characteristic Innovation Project of Guangdong Regular Institutions of Higher Education, "Key Technologies and Application Development of Intelligent Workshop Equipment for Big Data Applications" (2017KTSCX176)
Open Fund of the Hunan Provincial Key Laboratory of Health Maintenance for Mechanical Equipment (21903)
Keywords
rolling bearings
modal decomposition
wavelet packet threshold denoising
data processing