
Chinese Relationship Extraction Method Based on Multi-Feature BiGRU-ATT
Cited by: 1
Abstract: Targeting the complex syntactic features of Chinese sentences, this paper proposes a Chinese relation extraction model based on a multi-feature Bidirectional Gated Recurrent Unit with Attention (BiGRU-ATT). First, the Language Technology Platform (LTP) toolkit is used to obtain syntactic and other features together with the core predicate. Next, a BiGRU performs deep encoding to produce a sentence encoding matrix that incorporates the multiple features. Finally, an attention mechanism guided by the core predicate yields the sentence representation vector, and Softmax predicts the relation. Experiments show that the model achieves an F1 score of 85.5%, a considerable improvement over the Bidirectional Long Short-Term Memory with Attention (BLSTM-ATT), Bidirectional Recurrent Neural Network (BRNN), and Convolutional Neural Network (CNN) models.
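The final steps of the pipeline described above (predicate-guided attention over the encoded sentence, followed by Softmax relation prediction) can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the function names, dimensions, and the dot-product attention scoring are assumptions made for the sketch, and the token encodings `H` stand in for actual BiGRU outputs.

```python
import numpy as np

def predicate_attention(H, pred_idx):
    # H: (seq_len, d) token encodings (e.g. BiGRU outputs);
    # pred_idx: position of the core predicate in the sentence
    q = H[pred_idx]                       # predicate encoding as the attention query
    scores = H @ q                        # dot-product score for every token
    alpha = np.exp(scores - scores.max())
    alpha = alpha / alpha.sum()           # attention weights: softmax over tokens
    return alpha @ H                      # weighted sum -> sentence vector, shape (d,)

def predict_relation(sent_vec, W, b):
    # Linear layer followed by softmax over relation classes
    logits = W @ sent_vec + b
    p = np.exp(logits - logits.max())
    return p / p.sum()

rng = np.random.default_rng(0)
H = rng.normal(size=(12, 8))              # 12 tokens, 8-dim encodings (illustrative)
sent = predicate_attention(H, pred_idx=3)
probs = predict_relation(sent, rng.normal(size=(5, 8)), np.zeros(5))
print(sent.shape, probs.shape)
```

Guiding attention with the core predicate, rather than a learned global query, biases the sentence vector toward tokens syntactically related to the main verb, which is the intuition the abstract describes.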
Authors: LENG Gen; HUANG Yizi (School of Computer and Information Engineering, Hubei University, Wuhan, Hubei 430062, China)
Source: Information & Computer (《信息与电脑》), 2022, No. 11, pp. 13-16 (4 pages)
Keywords: relation classification; deep learning; multi-feature; attention mechanism
