
A word order conversion method based on an attention mechanism for sign language
(基于注意力机制的手语语序转换方法)
Abstract: Considering the differences in grammar and language structure between hearing-impaired and hearing people, this paper designs a sign language word order converter based on an attention mechanism to convert sign language word order into written expression. In the encoding stage, the converter uses a bidirectional long short-term memory (LSTM) network to extract features of the sign language word order; in the decoding stage, it uses one-dimensional convolution to extract features from the encoder's hidden states, and an attention mechanism is applied to avoid the long-distance dependency problem, so as to obtain the written expression. Experimental results show that the converter reaches an accuracy of up to 92.64%.
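Based only on the architecture outlined in the abstract (BiLSTM encoder, one-dimensional convolution over the encoder hidden states on the decoder side, and an attention mechanism), a rough sketch of such a converter could look like the following. This is a minimal illustration in PyTorch: the class and parameter names, layer sizes, exact placement of the convolution, and the choice of dot-product attention are all assumptions for illustration, not the authors' published implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SignOrderConverter(nn.Module):
    """Sketch of a BiLSTM encoder with a conv + attention decoder (details assumed)."""

    def __init__(self, src_vocab, tgt_vocab, emb_dim=128, hid_dim=256, kernel=3):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        # Encoder: bidirectional LSTM over the sign language word order sequence.
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        # 1D convolution over the encoder hidden states (assumed placement and kernel size).
        self.conv = nn.Conv1d(2 * hid_dim, 2 * hid_dim, kernel, padding=kernel // 2)
        # Decoder LSTM producing the written-expression sequence.
        self.decoder = nn.LSTM(emb_dim, 2 * hid_dim, batch_first=True)
        self.out = nn.Linear(4 * hid_dim, tgt_vocab)

    def forward(self, src, tgt):
        enc_out, _ = self.encoder(self.src_emb(src))              # (B, S, 2H)
        # Conv1d expects channels first: (B, 2H, S) -> back to (B, S, 2H).
        conv_feat = self.conv(enc_out.transpose(1, 2)).transpose(1, 2)
        dec_out, _ = self.decoder(self.tgt_emb(tgt))              # (B, T, 2H)
        # Dot-product attention of decoder states over the convolved encoder features.
        scores = torch.bmm(dec_out, conv_feat.transpose(1, 2))    # (B, T, S)
        context = torch.bmm(F.softmax(scores, dim=-1), conv_feat) # (B, T, 2H)
        return self.out(torch.cat([dec_out, context], dim=-1))    # (B, T, tgt_vocab)
```

For example, calling the model on integer tensors of shape (batch, src_len) and (batch, tgt_len) returns per-token logits over the written-language vocabulary, from which the converted sentence can be decoded.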
Authors: ZHANG Zheyan (张哲岩), WANG Qingshan (王青山) (School of Mathematics, Hefei University of Technology, Hefei 230601, China)
Source: Journal of Hefei University of Technology (Natural Science) 《合肥工业大学学报(自然科学版)》, 2023, Issue 1, pp. 42-46, 59 (6 pages); indexed in CAS and the Peking University Core Journals list.
Funding: National Natural Science Foundation of China (Grant No. 61571179).
Keywords: attention mechanism; word order conversion; encoder-decoder model; feature extraction