
Knowledge Graph Representation Learning Fused with Entity Category Information

Cited by: 9
Abstract: Knowledge graph representation learning embeds entities and relations into a continuous low-dimensional semantic space so as to capture the semantic correlations between them. This paper proposes a Category-Enhanced Knowledge Graph Representation Learning (CEKGRL) model that fuses entity category information. The model constructs structure-based and category-based entity representations, and captures the potential correlation between entity categories and triple relations through an attention mechanism, combining the varying importance of different entity categories for a specific relation with the category information itself for Knowledge Representation Learning (KRL). Experimental results on knowledge graph completion and triple classification tasks show that CEKGRL achieves clear improvements on the MeanRank and Hit@10 metrics; under the Filter setting of the entity prediction task, it outperforms the TKRL model by about 23.5% and 7.2 percentage points respectively, indicating better KRL performance.
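The abstract's core idea — a category-based entity representation built as a relation-conditioned attention over category embeddings, fused with a structure-based embedding — can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the TransE-style distance score, the additive fusion of the two representations, and all dimensions and variable names are assumptions for demonstration.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def category_representation(categories, relation):
    # Attention over an entity's category embeddings, conditioned on the
    # relation: categories more relevant to this relation get higher weight.
    scores = categories @ relation   # (num_categories,)
    alpha = softmax(scores)          # attention weights, sum to 1
    return alpha @ categories        # weighted sum -> (dim,) category-based repr.

def transe_score(head, relation, tail):
    # TransE-style plausibility: smaller ||h + r - t|| means a more
    # plausible triple, so we negate the distance.
    return -np.linalg.norm(head + relation - tail)

# Toy example with random embeddings (hypothetical data, seed for repeatability).
rng = np.random.default_rng(0)
dim = 8
h_struct, t_struct, r = rng.normal(size=(3, dim))  # structure-based embeddings
h_cats = rng.normal(size=(3, dim))  # head entity belongs to 3 categories
t_cats = rng.normal(size=(2, dim))  # tail entity belongs to 2 categories

# Fuse structure-based and category-based representations (additive fusion
# is an illustrative choice), then score the triple.
h = h_struct + category_representation(h_cats, r)
t = t_struct + category_representation(t_cats, r)
print(transe_score(h, r, t))
```

The key point the sketch shows is that the attention weights depend on the relation, so the same entity can emphasize different categories in different triples, which is what lets category information disambiguate entities.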
Authors: JIN Jing; WAN Huaiyu; LIN Youfang (Beijing Key Laboratory of Traffic Data Analysis and Mining, School of Computer and Information Technology, Beijing Jiaotong University, Beijing 100044, China)
Source: Computer Engineering (《计算机工程》), CAS / CSCD / Peking University Core journal, 2021, No. 4, pp. 77-83 (7 pages)
Funding: National Key R&D Program of China (2018YFC0830200)
Keywords: knowledge graph; Knowledge Representation Learning (KRL); multi-source information fusion; attention mechanism; entity disambiguation