
Multi-head attention-based long short-term memory model for speech emotion recognition

Abstract: To fully exploit information from different representation subspaces, a multi-head attention-based long short-term memory (LSTM) model is proposed in this study for speech emotion recognition (SER). The proposed model uses frame-level features and takes the temporal information of emotional speech as the input of the LSTM layer. A multi-head time-dimension attention (MHTA) layer is employed to linearly project the output of the LSTM layer into different subspaces, yielding reduced-dimension context vectors. To provide complementary information from other dimensions, the output of MHTA, the output of the feature-dimension attention, and the last time-step output of the LSTM are combined to form multiple context vectors that serve as the input of the fully connected layer. To improve the quality of these vectors, feature-dimension attention is applied to the outputs of the first LSTM layer at all time steps. The proposed model was evaluated on the eNTERFACE and GEMEP corpora. The results indicate that it outperforms a plain LSTM by 14.6% and 10.5% on eNTERFACE and GEMEP, respectively, demonstrating its effectiveness in SER tasks.
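To make the described architecture concrete, below is a minimal PyTorch sketch, not the authors' code: two stacked LSTM layers over frame-level features, multi-head self-attention along the time dimension (standing in for MHTA), feature-dimension attention over the first LSTM layer's outputs, and fusion of both context vectors with the last time-step output before a fully connected classifier. All hyperparameters (feature dimension, hidden size, number of heads, number of emotion classes) and the exact pooling and fusion details are assumptions; the abstract does not specify them.

```python
import torch
import torch.nn as nn

class MHTALSTM(nn.Module):
    """Illustrative sketch of the multi-head attention-based LSTM for SER.

    All hyperparameters below are assumptions; the abstract does not
    state the actual values used in the paper.
    """

    def __init__(self, feat_dim=120, hidden=128, heads=4, n_classes=6):
        super().__init__()
        # Two stacked LSTM layers over frame-level features.
        self.lstm1 = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.lstm2 = nn.LSTM(hidden, hidden, batch_first=True)
        # Multi-head time-dimension attention (MHTA): projects the LSTM
        # output into several subspaces and attends over the time axis.
        self.mhta = nn.MultiheadAttention(hidden, heads, batch_first=True)
        # Feature-dimension attention: learns a weight per feature channel.
        self.feat_att = nn.Linear(hidden, hidden)
        # The fused input is three concatenated context vectors.
        self.fc = nn.Linear(3 * hidden, n_classes)

    def forward(self, x):
        # x: (batch, time, feat_dim) frame-level acoustic features.
        h1, _ = self.lstm1(x)          # (batch, time, hidden)
        h2, _ = self.lstm2(h1)         # (batch, time, hidden)

        # Time-dimension context vector: self-attention over h2,
        # mean-pooled over time to a fixed-size vector.
        a, _ = self.mhta(h2, h2, h2)   # (batch, time, hidden)
        c_time = a.mean(dim=1)

        # Feature-dimension context vector: softmax weights over the
        # feature axis of the first LSTM layer's outputs, then mean
        # over time.
        w = torch.softmax(self.feat_att(h1), dim=-1)
        c_feat = (w * h1).mean(dim=1)

        # Last time-step output of the second LSTM layer.
        c_last = h2[:, -1, :]

        # Fuse the three context vectors for the fully connected layer.
        return self.fc(torch.cat([c_time, c_feat, c_last], dim=-1))


# Usage: a batch of 8 utterances, 300 frames each, 120-dim features.
x = torch.randn(8, 300, 120)
logits = MHTALSTM()(x)   # (8, 6) emotion-class scores
```

Mean-pooling the attention output and fusing by concatenation are illustrative choices; the paper may use a different pooling or weighting scheme.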
Authors: Zhao Yan, Zhao Li, Lu Cheng, Li Sunan, Tang Chuangao, Lian Hailun (School of Information Science and Engineering, Southeast University, Nanjing 210096, China; School of Biological Science and Medical Engineering, Southeast University, Nanjing 210096, China)
Source: Journal of Southeast University (English Edition), EI CAS, 2022, Issue 2, pp. 103-109 (7 pages)
Funding: The National Natural Science Foundation of China (No. 61571106, 61633013, 61673108, 81871444).
Keywords: speech emotion recognition; long short-term memory (LSTM); multi-head attention mechanism; frame-level features; self-attention