Abstract
Entity linking and relation linking, the core components of question answering over knowledge bases, connect natural language questions to knowledge base information. They are usually performed as two independent tasks, which ignores the mutual influence between the information produced during linking. Moreover, computing the similarity of candidate entities and candidate relations against the question separately ignores the internal connection between the candidates. This study proposes a neural-network-based joint entity and relation linking method using feature fusion and multi-attention. A neural network encodes the question, entities, relations, and entity-relation pairs and learns their vector representations; an attention mechanism captures the weight of each candidate entity and candidate relation within the question; and the entity-relation pair vector is incorporated when computing the similarity between the entity (or relation) vector and the question vector, exploiting the information contained in the pair to improve linking accuracy. Experimental results on the LC-QuAD and QALD-7 datasets show that the method improves linking accuracy by at least 1% over the Falcon model.
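The scoring scheme described in the abstract can be sketched roughly as follows: pool the question's token embeddings with attention keyed on the candidate, then mix the candidate similarity with the entity-relation pair similarity. This is an illustrative assumption of the general idea, not the paper's exact model; the dot-product attention, cosine mixing, and the `alpha` weight are all hypothetical choices.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cosine(a, b):
    # cosine similarity with a small epsilon to avoid division by zero
    na = math.sqrt(dot(a, a))
    nb = math.sqrt(dot(b, b))
    return dot(a, b) / (na * nb + 1e-9)

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attend(question_tokens, key):
    # Pool the question's token embeddings, weighting each token by its
    # dot-product similarity to the candidate (entity or relation) vector.
    weights = softmax([dot(tok, key) for tok in question_tokens])
    dim = len(question_tokens[0])
    return [sum(w * tok[i] for w, tok in zip(weights, question_tokens))
            for i in range(dim)]

def link_score(question_tokens, cand_vec, pair_vec, alpha=0.5):
    # Score a candidate: similarity between the attention-pooled question
    # vector and the candidate, mixed with similarity to the entity-relation
    # pair vector. `alpha` is a hypothetical mixing weight, not from the paper.
    q = attend(question_tokens, cand_vec)
    return alpha * cosine(q, cand_vec) + (1 - alpha) * cosine(q, pair_vec)
```

Under this sketch, candidates (and their entity-relation pairs) that align with the attention-weighted question representation receive higher scores and would be ranked first during linking.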
Authors
FU Lin; LIU Zhao; QIU Chen; GAO Feng
(School of Computer Science and Technology, Wuhan University of Science and Technology, Wuhan 430065, China; Hubei Province Key Laboratory of Intelligent Information Processing and Real-time Industrial, Wuhan 430065, China; Big Data Science and Engineering Research Institute, Wuhan University of Science and Technology, Wuhan 430065, China; Key Laboratory of Rich Media Digital Publishing Content Organization and Knowledge Service of Press and Publication Administration, Beijing 100083, China)
Source
Computer Engineering (《计算机工程》), 2022, Issue 8, pp. 53-61 (9 pages)
Indexed in: CAS, CSCD, Peking University Core Journals (北大核心)
Funding
National Natural Science Foundation of China, "Research on Key Technologies for Domain-Specific Knowledge Graph Construction and Application" (U1836118); Open Fund of the Key Laboratory of Rich Media Digital Publishing Content Organization and Knowledge Service of the Press and Publication Administration (ZD2021-11/01).
Keywords
Knowledge Base Question Answering (KBQA)
joint entity and relation linking
entity-relation pair
attention mechanism
knowledge graph