
Entity relation joint extraction model fusing MacBERT and Talking-Heads Attention
Abstract: Existing models for relation extraction from medical text suffer from insufficient semantic comprehension during training, which can lead to unsatisfactory extraction results. To address this, a joint entity-relation extraction model fusing MacBERT and Talking-Heads Attention is proposed. The model first uses the MacBERT language model to obtain dynamic word-vector representations; as an improved BERT model, MacBERT reduces the discrepancy between the pre-training and fine-tuning stages and thus improves generalization. The dynamic word vectors are then fed into a bidirectional gated recurrent unit (BiGRU) to extract contextual features of the text; BiGRU is an improved recurrent neural network (RNN) with better long-term dependency capture. Talking-Heads Attention, a self-attention mechanism that captures relations between different positions in the text, is then applied to the contextual features to obtain global features and improve the accuracy of relation extraction. Experimental results show that, compared with the entity-relation joint extraction model GRTE, the proposed model improves the F1 score by 1%, precision by 0.4%, and recall by 1.5%.
Authors: WANG Chunliang (王春亮), YAO Jieyi (姚洁仪), LI Zhao (李昭) (Hubei Key Laboratory of Intelligent Vision Based Monitoring for Hydroelectric Engineering, Yichang 443000, China; College of Computer and Information Technology, China Three Gorges University, Yichang 443000, China)
Source: Modern Electronics Technique (《现代电子技术》), PKU Core Journal, 2024, Issue 5, pp. 127-131 (5 pages)
Keywords: MacBERT; BiGRU; relation extraction; medical text; Talking-Heads Attention; deep learning; global feature; neural network
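
As a rough illustration of the pipeline described in the abstract (MacBERT dynamic vectors, then BiGRU contextual features, then Talking-Heads Attention for global features), a minimal PyTorch sketch is given below. This is not the authors' released implementation; the checkpoint name "hfl/chinese-macbert-base", the GRU hidden size, and the head count are illustrative assumptions, and the entity/relation decoding heads are omitted.

import torch
import torch.nn as nn
from transformers import BertModel


class TalkingHeadsAttention(nn.Module):
    # Self-attention with learned linear mixing across heads before and after the
    # softmax (Shazeer et al., 2020); hidden must be divisible by heads.
    def __init__(self, hidden, heads=8):
        super().__init__()
        assert hidden % heads == 0
        self.h, self.d = heads, hidden // heads
        self.qkv = nn.Linear(hidden, hidden * 3)
        self.mix_logits = nn.Linear(heads, heads, bias=False)   # mix attention logits across heads
        self.mix_weights = nn.Linear(heads, heads, bias=False)  # mix attention weights across heads
        self.out = nn.Linear(hidden, hidden)

    def forward(self, x):                                        # x: (batch, seq_len, hidden)
        b, t, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q, k, v = (m.view(b, t, self.h, self.d).transpose(1, 2) for m in (q, k, v))
        logits = q @ k.transpose(-2, -1) / self.d ** 0.5         # (batch, heads, seq, seq)
        logits = self.mix_logits(logits.permute(0, 2, 3, 1)).permute(0, 3, 1, 2)
        weights = logits.softmax(dim=-1)
        weights = self.mix_weights(weights.permute(0, 2, 3, 1)).permute(0, 3, 1, 2)
        ctx = (weights @ v).transpose(1, 2).reshape(b, t, -1)
        return self.out(ctx)


class MacBertBiGruTalkingHeads(nn.Module):
    # MacBERT dynamic vectors -> BiGRU contextual features -> Talking-Heads global features.
    def __init__(self, pretrained="hfl/chinese-macbert-base", gru_hidden=384, heads=8):
        super().__init__()
        self.encoder = BertModel.from_pretrained(pretrained)     # MacBERT shares the BERT architecture
        self.bigru = nn.GRU(self.encoder.config.hidden_size, gru_hidden,
                            batch_first=True, bidirectional=True)
        self.attn = TalkingHeadsAttention(gru_hidden * 2, heads)

    def forward(self, input_ids, attention_mask):
        h = self.encoder(input_ids, attention_mask=attention_mask).last_hidden_state
        h, _ = self.bigru(h)
        return self.attn(h)   # feed into entity/relation decoding heads (not shown here)

The only difference from standard multi-head attention is the two small linear layers that let heads exchange information in the logit and weight spaces; this cross-head mixing is what allows the layer to aggregate global features over the BiGRU outputs.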