
Cross-language relation extraction method based on attention transfer (Cited by: 1)
Abstract: To address the growing volume of multilingual text on the Internet and the scarcity of large-scale annotated parallel corpora, and to mine cross-lingual relevance from multilingual information sources and expand knowledge graphs, this paper proposes a cross-language relation extraction method based on attention transfer. First, cross-language parallel corpus mapping is performed according to the parallel corpora actually available between languages; for low-resource language pairs lacking seed dictionaries, a neural machine translation model is used to obtain the target-language dataset while preserving the corresponding attention weights between languages. Next, a BERT-based end-to-end joint extraction model extracts entity-relation features from the training data, and the inter-language attention weights are transferred back. Finally, the back-transferred attention is used to perform enhanced relation extraction. Experiments show that the model outperforms comparison models in both precision and recall, and still performs well in the absence of a bilingual dictionary.
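The abstract gives no implementation details, so the following is only a rough, hypothetical numpy sketch of the attention-transfer step (all function names, shapes, and values are assumptions, not the authors' code): cross-attention weights saved from the NMT model are projected onto the translated sentence and used to re-weight BERT-style token representations before relation classification.

import numpy as np

def transfer_attention(nmt_cross_attn, src_token_weights):
    # nmt_cross_attn: (tgt_len, src_len) cross-attention saved while the
    # NMT model translated the source sentence into the target language.
    # src_token_weights: (src_len,) importance of each source token for
    # the relation (e.g. attention around the two entity mentions).
    tgt_weights = nmt_cross_attn @ src_token_weights     # (tgt_len,)
    return tgt_weights / (tgt_weights.sum() + 1e-9)      # renormalize

def attention_enhanced_pooling(token_reprs, tgt_weights):
    # Weighted pooling of encoder token representations (tgt_len, dim)
    # into a single vector for the downstream relation classifier.
    return (tgt_weights[:, None] * token_reprs).sum(axis=0)

# Toy example: 4 target tokens aligned to 3 source tokens.
cross_attn = np.array([[0.7, 0.2, 0.1],
                       [0.1, 0.8, 0.1],
                       [0.2, 0.2, 0.6],
                       [0.3, 0.3, 0.4]])
src_w = np.array([0.5, 0.4, 0.1])       # source-side entity attention
reprs = np.random.rand(4, 768)          # stand-in for BERT token outputs
pooled = attention_enhanced_pooling(reprs, transfer_attention(cross_attn, src_w))
print(pooled.shape)                      # (768,)

In this reading, the NMT cross-attention acts as a soft token alignment, so importance estimated on the source side carries over to the target side without requiring a bilingual dictionary.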
Authors: Wu Jing; Yang Bailong; Tian Luogeng (Dept. of Information & Communication Engineering, Rocket Force University of Engineering, Xi'an 710000, China; Dept. of Information & Communication, National University of Defense Technology, Xi'an 710000, China)
Source: Application Research of Computers (《计算机应用研究》; CSCD; Peking University Core Journal), 2022, Issue 2, pp. 417-423 (7 pages)
Keywords: neural machine translation; relation extraction; unsupervised; attention transfer; BERT pre-training