Abstract
To further improve English translation in multilingual scenarios, a multimodal neural machine translation (MNMT) model based on a context gating mechanism is proposed, built on a cross-modal entity fusion method. Taking the MNMT model as the base translation model, a context gating mechanism is introduced to optimize the model and further improve its translation quality. Experimental results show that, compared with the baseline model and the other translation models introduced for comparison, the proposed model delivers better English translation: on the BLEU, METEOR, and TER metrics it achieves 63.0, 77.5, and 23.1 on the Multi30k-16 dataset and 62.8, 76.4, and 23.5 on the Multi30k-17 dataset, respectively. Compared with the baseline model, the proposed model also shows clear improvements in both fidelity and fluency. In summary, the proposed context-gated MNMT model performs well, effectively improves translation quality, and can achieve good results when applied to practical multilingual English translation scenarios, making it highly feasible.
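The context gating described in the abstract can be sketched as a learned sigmoid gate that interpolates elementwise between the textual decoder state and a cross-modal (image/entity) feature. This is a minimal illustration under assumed names and dimensions, not the authors' implementation; the weights here are random, whereas in the actual model they would be trained end-to-end:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def context_gate(h_text, v_image, W, b):
    """Context gate: g = sigmoid(W [h; v] + b); fused = g * h + (1 - g) * v.

    h_text, v_image: (d,) vectors; W: (d, 2d) weight matrix; b: (d,) bias.
    Each fused component is a convex combination of the text and image features,
    so the gate decides, per dimension, how much visual context to admit.
    """
    concat = np.concatenate([h_text, v_image])   # (2d,) joint context
    g = sigmoid(W @ concat + b)                  # (d,) gate values in (0, 1)
    return g * h_text + (1.0 - g) * v_image

# Toy usage with random (untrained) parameters, for illustration only.
rng = np.random.default_rng(0)
d = 4
h = rng.standard_normal(d)            # decoder hidden state (text side)
v = rng.standard_normal(d)            # projected image/entity feature
W = rng.standard_normal((d, 2 * d)) * 0.1
b = np.zeros(d)
fused = context_gate(h, v, W, b)
print(fused.shape)                    # (4,)
```

Because the gate output lies strictly in (0, 1), the fused vector always stays between the text and image features in each dimension, which is what lets the model softly suppress irrelevant visual information rather than hard-selecting one modality.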
Authors
陈折
解辰
CHEN Zhe; XIE Chen (Yangling Vocational & Technical College, Yangling, Shaanxi 712100, China; Shaanxi Province Institute of Water Resources and Electric Power Investigation and Design, Xi'an 710005, China)
Source
Automation & Instrumentation (《自动化与仪器仪表》)
2024, No. 8, pp. 247-250 (4 pages)
Funding
2022 Education and Teaching Reform Research Project of Yangling Vocational & Technical College, "Research on Blended Teaching and Practice of the Flipped English Classroom in Higher Vocational Education under an AI-Embedded Mode" (JG22099).
Keywords
English translation
entity fusion
MNMT model
gating mechanism
attention mechanism