
Circular Convolutional Neural Networks Based on Triplet Attention
Abstract: Existing knowledge graph completion models that incorporate textual or neighbor information ignore the interaction between texts and neighbors, making it difficult to capture information with strong semantic relevance to entities. In addition, models based on convolutional neural networks do not take the relation-specific information in entities into account, which results in poor prediction performance. In this paper, a circular convolutional neural network model based on triplet attention is proposed, combining textual information and topological neighbor information. First, words with strong semantic relevance to entities are selected from textual descriptions by semantic matching, and then combined with topological neighbors as entity neighbors to enhance entity representations. Next, the fused entity representations and the relation representations are reshaped. Finally, triplet attention is utilized to optimize the input of the convolution, so that the convolution operation can extract relation-related features in entities and thus improve model performance. Link prediction experiments on several public datasets show that the proposed model achieves superior performance.
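The circular convolution named in the abstract can be illustrated with a minimal sketch. The paper's exact operator is not given here, so the following assumes a 1-D circular (wrap-around) convolution over an embedding vector, as used in related knowledge-graph-completion models; the vector and filter values are hypothetical.

```python
import numpy as np

def circular_conv1d(x, kernel):
    """1-D circular convolution: the input is padded by wrapping it around
    itself, so every output position sees a full-size receptive field."""
    k = len(kernel)
    pad = k // 2
    # Wrap-around (circular) padding instead of zero padding.
    xp = np.concatenate([x[-pad:], x, x[:pad]])
    return np.array([np.dot(xp[i:i + k], kernel) for i in range(len(x))])

# Hypothetical example: a 4-dim "entity" vector and a size-3 filter.
x = np.array([1.0, 2.0, 3.0, 4.0])
kernel = np.array([1.0, 1.0, 1.0])
print(circular_conv1d(x, kernel))  # each output sums a wrapped 3-wide window
```

Unlike standard zero-padded convolution, no boundary position is starved of context, which is why circular convolution is favored for fixed-length embedding inputs.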
Authors: WANG Jingbin, LEI Jing, ZHANG Jingxuan, SUN Shounan (College of Computer and Data Science, Fuzhou University, Fuzhou 350108)
Source: Pattern Recognition and Artificial Intelligence, 2022, No. 2, pp. 116-129 (14 pages). Indexed in EI and CSCD; Peking University Core journal.
Funding: Supported by the National Natural Science Foundation of China (No. 61672159) and the Natural Science Foundation of Fujian Province (No. 2021J01619).
Keywords: Knowledge Graph Completion; Textual Information; Topological Neighbors; Circular Convolution; Triplet Attention