
Improved speech enhancement algorithm with multi-level hybrid attention skip connections (cited by: 1)
Abstract: To address the problems of classic speech enhancement networks in low signal-to-noise-ratio (SNR) environments, namely lost speech, low intelligibility, and a mismatch between the cost function and the evaluation metrics, an improved speech enhancement algorithm based on a multi-level hybrid attention skip-connection Skip-DNN is proposed. At the network input layer, features from multiple complementary domains are fused to improve the model's generalization ability. In the middle layers, a multi-level skip-connection structure with nested cross-level connections, built from a global attention module and a self-attention module, fuses the important feature information extracted by each network block and reduces the redundancy introduced by feature fusion. A cost function jointly optimized with correlation coefficients is proposed to improve the correlation with the evaluation metrics. Experimental results show that the proposed method significantly improves speech quality and intelligibility, by 0.31 and 0.05 on average respectively, and generalizes better, especially in low-SNR environments.
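The correlation-coefficient joint cost function described in the abstract can be sketched as a weighted sum of a reconstruction (MSE) term and a term penalizing low Pearson correlation between the enhanced and clean signals, so that training aligns with correlation-based perceptual metrics. This is a minimal illustration under stated assumptions, not the paper's exact formulation; the weight `alpha` and the function name are hypothetical.

```python
import numpy as np

def correlation_joint_loss(enhanced, clean, alpha=0.5):
    """Hypothetical sketch of a correlation-coefficient joint cost:
    alpha * MSE + (1 - alpha) * (1 - Pearson correlation).
    `alpha` trades off reconstruction error against metric correlation."""
    enhanced = np.asarray(enhanced, dtype=float)
    clean = np.asarray(clean, dtype=float)
    # Reconstruction term: mean squared error between the signals.
    mse = np.mean((enhanced - clean) ** 2)
    # Pearson correlation coefficient of the zero-mean signals.
    e = enhanced - enhanced.mean()
    c = clean - clean.mean()
    corr = np.sum(e * c) / (np.sqrt(np.sum(e ** 2) * np.sum(c ** 2)) + 1e-8)
    # Perfectly correlated, identical signals give zero loss.
    return alpha * mse + (1.0 - alpha) * (1.0 - corr)
```

A perfectly enhanced signal (identical to the clean reference) drives both terms to zero, while an anti-correlated output is penalized by both the MSE and the correlation term.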
Authors: XING Lu; LI Hongyan; ZHANG Yu; REN Jian (School of Information and Computer, Taiyuan University of Technology, Jinzhong 030600, China)
Source: Electronic Design Engineering (《电子设计工程》), 2023, No. 8, pp. 15-20 (6 pages)
Funding: Natural Science Foundation of Shanxi Province (201701D121058); Shanxi Scholarship Council Research Project (2020-042)
Keywords: speech enhancement; multi-level hybrid attention skip connection; perceptual related cost function; skip connection deep neural network