
Performance comparison of neural machine translation systems in Uyghur-Chinese translation (Cited by: 25)
Abstract: Neural machine translation based on deep learning has significantly surpassed traditional statistical machine translation on many language pairs and has become the mainstream machine translation technology. Working at the level of word granularity, this paper analyzes and compares six influential neural machine translation methods on the Uyghur-Chinese translation task: the attention mechanism (GroundHog), vocabulary expansion (LV-groundhog), subword units on both the source and target sides (subword-nmt), a mixture of characters and words (nmt.hybrid), subword units with characters (dl4mt-cdec), and fully character-level translation (dl4mt-c2c). The experimental results show that Uyghur-Chinese neural machine translation performs best when the source language is segmented into subword units and the target language is represented by characters (dl4mt-cdec). This paper is the first to apply neural machine translation to Uyghur-Chinese machine translation, and the first to compare different neural machine translation methods on the same corpus. This work is an important reference both for Uyghur-Chinese machine translation and for further research on neural machine translation.
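The word-granularity distinctions the abstract draws between the compared systems can be sketched as follows. This is an illustrative toy example only, not code from any of the six systems: the `merges` list is a hypothetical BPE-style merge table, whereas subword-nmt learns its merges from corpus statistics.

```python
def word_tokens(sentence):
    """Word-level tokens, the granularity used by GroundHog / LV-groundhog."""
    return sentence.split()

def char_tokens(sentence):
    """Character-level tokens, as on the target side of dl4mt-cdec
    or both sides of dl4mt-c2c."""
    return [c for c in sentence if not c.isspace()]

def bpe_tokens(word, merges):
    """Greedy BPE-style segmentation of one word into subword units,
    in the spirit of subword-nmt (toy merge list, not a learned one)."""
    symbols = list(word)
    for a, b in merges:
        i, merged = 0, []
        while i < len(symbols):
            # Apply this merge wherever the adjacent pair (a, b) occurs.
            if i + 1 < len(symbols) and symbols[i] == a and symbols[i + 1] == b:
                merged.append(a + b)
                i += 2
            else:
                merged.append(symbols[i])
                i += 1
        symbols = merged
    return symbols

merges = [("l", "o"), ("lo", "w"), ("e", "r")]
print(word_tokens("lower cost"))    # ['lower', 'cost']
print(char_tokens("lower cost"))    # ['l', 'o', 'w', 'e', 'r', 'c', 'o', 's', 't']
print(bpe_tokens("lower", merges))  # ['low', 'er']
```

The paper's best-performing configuration (dl4mt-cdec) corresponds to applying the subword segmentation to the source side and the character segmentation to the target side.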
Source: Journal of Tsinghua University (Science and Technology) (EI, CAS, CSCD, Peking University Core Journal), 2017, No. 8, pp. 878-883 (6 pages)
Funding: National Natural Science Foundation of China Key Program (61331013); National High-Tech Research and Development (863) Program (2015AA015407)
Keywords: neural machine translation; low-resource language; Uyghur
