

Neural Network Speech Watermarking Method Based on Short-Term Energy and Least Relative Mean Square Error Criterion
Abstract: To overcome the limitations of neural network speech watermarking based on the traditional least mean square error (LMS) and recursive least squares (RLS) criteria, a neural network speech watermarking method based on short-term energy and the least relative mean square error (LRMS) criterion is proposed. First, a synchronization sequence is embedded into the first frame of the speech. Then the short-term energy of each frame is computed, and a discrete wavelet transform (DWT) is applied to the frames whose energy exceeds a preset threshold. Finally, the watermark is embedded and extracted by a neural network trained with the LRMS criterion. Setting a reasonable short-term energy threshold balances watermark capacity against robustness, and the Levenberg-Marquardt (LM) algorithm makes the network converge quickly. Theoretical analysis and experimental results show that, compared with reference [8], the proposed scheme converges faster, is more robust against attacks such as additive noise, low-pass filtering, resampling, and re-quantization, and improves performance by 5% on average.
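
The abstract outlines a concrete pipeline: frame segmentation, short-term energy thresholding, DWT of the selected frames, and an LRMS-trained network. The sketch below illustrates the frame-selection step and one plausible reading of the LRMS criterion; it is not the authors' code. The frame length, energy threshold, wavelet family (db4), decomposition level, and the exact form of the relative error are illustrative assumptions, and the network training with the Levenberg-Marquardt algorithm is omitted.

```python
import numpy as np
import pywt  # PyWavelets, for the discrete wavelet transform


def short_term_energy(frames):
    """Short-term energy of each frame: sum of squared samples."""
    return np.sum(frames ** 2, axis=1)


def select_embedding_frames(speech, frame_len=1024, energy_threshold=0.01):
    """Split the speech signal into fixed-length frames and keep only those
    whose short-term energy exceeds the threshold; these are the candidate
    frames for watermark embedding. frame_len and energy_threshold are
    illustrative values, not parameters from the paper."""
    n_frames = len(speech) // frame_len
    frames = speech[:n_frames * frame_len].reshape(n_frames, frame_len)
    return frames[short_term_energy(frames) > energy_threshold]


def approximation_coefficients(frame, wavelet="db4", level=3):
    """Wavelet-decompose a selected frame and return the low-frequency
    approximation coefficients, a common host for watermark bits."""
    return pywt.wavedec(frame, wavelet, level=level)[0]


def lrms_error(target, output, eps=1e-8):
    """One reading of the least relative mean square error criterion:
    the squared error normalised by the squared target, so low-amplitude
    samples are weighted as strongly as high-amplitude ones."""
    return np.mean(((target - output) / (target + eps)) ** 2)
```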
Source: Journal of Data Acquisition and Processing (《数据采集与处理》), CSCD / Peking University Core Journal, 2014, No. 2, pp. 254-258 (5 pages)
Funding: Supported by the National Natural Science Foundation of China (Grant No. 61072042)
Keywords: short-term energy; least relative mean square error; discrete wavelet transform; Levenberg-Marquardt (LM) algorithm