
The Model of Knowledge Graph Embedding with Text and Relation Path (cited by: 1)
Abstract  Considering that the existing models cannot completely take advantage of the semantic information of texts and paths, a new model of knowledge graph embedding (named the GETR model) is proposed. First, LDA is used to enrich the semantics of an entity description text and TWE is used to obtain word embeddings and topic embeddings. To enhance the representation of entities, a modified Bi-LSTM model is exploited to encode the word and topic embeddings into the entity representation. Furthermore, the multiple-step path between two entities is obtained through random walks with a strategy combining PageRank and cosine similarity. Additionally, to filter the noise in paths and improve the efficiency of the model, the important semantics of the multi-step path are captured with a self-attention mechanism and used for joint training with the translation model. Finally, the proposed model GETR, as well as the baseline models TransE, DKRL and TKGE, is evaluated on the tasks of knowledge graph completion and entity classification with three datasets: FB15K, FB20K and WN18. Experimental results demonstrate that the proposed model outperforms the baselines, indicating that it is a more effective method of knowledge representation.
Authors  XIAO Bao; WEI Lina; LI Pu; JIANG Yuncheng (School of Electronics and Information Engineering, Beibu Gulf University, Qinzhou 535011, China; School of Computer Science, South China Normal University, Guangzhou 510631, China; Software Engineering College, Zhengzhou University of Light Industry, Zhengzhou 450000, China; School of Information Science and Engineering, Guangxi University for Nationalities, Nanning 530006, China)
Source  Journal of South China Normal University (Natural Science Edition) (CAS; PKU Core), 2020, No. 6, pp. 103-112 (10 pages)
Funding  National Natural Science Foundation of China (61802352); Basic Research Ability Improvement Project for Young and Middle-aged Teachers in Guangxi Universities (2019KY0463); Qinzhou Science Research and Technology Development Program (20189903).
Keywords  knowledge graph embedding; random walks; self-attention mechanism; multiple-step path; entity description text
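
The abstract describes two mechanisms concretely enough to illustrate. Below is a minimal Python sketch, under assumptions, of the path-sampling idea: a random walk over the knowledge graph whose next-hop choice combines a PageRank score with cosine similarity to the target entity. Function and parameter names (walk_path, alpha, max_hops) are illustrative and not taken from the paper's implementation.

```python
# Hedged sketch of PageRank + cosine-similarity guided random walks (not the paper's code).
import numpy as np
import networkx as nx

def walk_path(graph, entity_emb, head, tail, max_hops=3, alpha=0.5, seed=0):
    """Sample one multi-step path from `head` toward `tail`.

    graph      : networkx.DiGraph whose edges carry a 'relation' attribute
    entity_emb : dict mapping entity id -> np.ndarray embedding
    alpha      : weight balancing PageRank against cosine similarity to the tail
    """
    rng = np.random.default_rng(seed)
    pagerank = nx.pagerank(graph)                  # global structural importance of nodes
    target = entity_emb[tail]

    def cos(u):                                    # cosine similarity to the tail entity
        v = entity_emb[u]
        return float(v @ target / (np.linalg.norm(v) * np.linalg.norm(target) + 1e-9))

    path, node = [], head
    for _ in range(max_hops):
        neighbors = list(graph.successors(node))
        if not neighbors:
            break
        # Score candidate hops by PageRank + similarity, then sample via a softmax.
        scores = np.array([alpha * pagerank[n] + (1 - alpha) * cos(n) for n in neighbors])
        probs = np.exp(scores - scores.max())
        probs /= probs.sum()
        idx = rng.choice(len(neighbors), p=probs)
        nxt = neighbors[idx]
        path.append((node, graph[node][nxt]['relation'], nxt))
        node = nxt
        if node == tail:
            break
    return path
```

The second mechanism, self-attention over a sampled path combined with a translation model, can be sketched in PyTorch as follows: self-attention pools the relation embeddings along the path into one path vector, which is then scored with a TransE-style constraint h + p ≈ t. Class and hyperparameter names (PathScorer, dim, n_heads) are assumptions for illustration only.

```python
# Hedged sketch of self-attention over path relations with a TransE-style score.
import torch
import torch.nn as nn

class PathScorer(nn.Module):
    def __init__(self, n_entities, n_relations, dim=100, n_heads=4):
        super().__init__()
        self.ent = nn.Embedding(n_entities, dim)
        self.rel = nn.Embedding(n_relations, dim)
        self.attn = nn.MultiheadAttention(dim, n_heads, batch_first=True)

    def forward(self, heads, tails, path_rels):
        # heads, tails: (batch,) entity ids; path_rels: (batch, path_len) relation ids
        h, t = self.ent(heads), self.ent(tails)
        r = self.rel(path_rels)                    # (batch, path_len, dim)
        attended, _ = self.attn(r, r, r)           # self-attention over the path relations
        p = attended.mean(dim=1)                   # pooled path representation
        # TransE-style energy: smaller means the path better translates head to tail.
        return torch.norm(h + p - t, p=2, dim=-1)
```

In the joint training described in the abstract, such path scores would be combined with the ordinary triple score and with the text-enhanced entity representations; the sketch only shows the path-level attention and the translation constraint.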

