
Neural Machine Translation Inclined to Close Neighbor Association

Cited by: 4
Abstract: Existing neural machine translation models, when modeling sequences, consider only the association between the target side and its corresponding source side; they do not model associations within the source side or within the target side. This paper models source-side and target-side associations separately and designs a suitable loss function so that each source-side hidden state becomes more related to the hidden states of its K neighboring words, and each target-side hidden state becomes more related to the hidden states of its M preceding words. Experimental results on a large-scale Chinese-English dataset show that, compared with neural machine translation that considers only the target-to-source association, the proposed method constructs better close-neighbor association representations and improves the translation quality of the machine translation system.
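The auxiliary objective described in the abstract can be sketched roughly as follows. This is a minimal illustration only: it assumes cosine similarity as the association measure between hidden states, which the abstract does not specify, and the function name `neighbor_association_loss` and the `causal` flag are hypothetical. On the source side one would apply it with a symmetric window of K neighbors; on the target side, with a causal window of the M preceding positions.

```python
import numpy as np

def neighbor_association_loss(hidden, k, causal=False):
    """Average (1 - cosine similarity) between each hidden state and
    its neighboring hidden states.

    hidden: (T, d) array of hidden states for a sequence of length T.
    k:      neighborhood size (K for the source side, M for the target side).
    causal: if True, only the k preceding positions count as neighbors
            (target side); otherwise k positions on each side (source side).
    """
    T, _ = hidden.shape
    # L2-normalize so that dot products equal cosine similarities.
    normed = hidden / np.linalg.norm(hidden, axis=1, keepdims=True)
    total, count = 0.0, 0
    for i in range(T):
        lo = max(0, i - k)
        hi = i if causal else min(T, i + k + 1)
        for j in range(lo, hi):
            if j == i:
                continue
            total += 1.0 - float(normed[i] @ normed[j])
            count += 1
    return total / max(count, 1)
```

In training, such a term would be added to the usual translation loss with a weighting coefficient, e.g. `loss = ce_loss + lam * (neighbor_association_loss(src_h, K) + neighbor_association_loss(tgt_h, M, causal=True))`; the actual formulation in the paper may differ.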
Authors: WANG Kun, DUAN Xiang-yu (School of Computer Science & Technology, Soochow University, Suzhou, Jiangsu 215006, China)
Published in: Computer Science (《计算机科学》, CSCD, Peking University Core), 2019, No. 5, pp. 198-202
Funding: National Natural Science Foundation of China (61673289); National Key R&D Program of China, "Intergovernmental International Science and Technology Innovation Cooperation" key special project (2016YFE0132100)
Keywords: machine translation; close neighbor association; attention mechanism