
Knowledge base question answering based on entity disambiguation and multiple granularity attention (Cited by: 3)
Abstract: To address the loss of original interaction information in the encode-compare framework used by existing knowledge base question answering systems, a method based on entity disambiguation and multi-granularity attention is proposed. The relevance between the question and knowledge base relations is modeled at multiple granularities, and a bi-directional attention mechanism is introduced to aggregate vectors more effectively, preserving original information and achieving fine-grained alignment between tokens in relation detection. To improve the accuracy of entity linking, a bi-directional long short-term memory network with a conditional random field layer (BiLSTM-CRF) is incorporated to remove the dependence on hand-crafted features, and the similarity between relation words in the question and candidate relations is computed to reduce noisy data and achieve entity disambiguation. Experimental results on the SimpleQuestions dataset show that the model clearly improves accuracy, reaching 94.1%.
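The bi-directional attention aggregation described in the abstract can be illustrated with a minimal sketch. The embeddings, tokens, and pooling choices below are illustrative assumptions, not the paper's actual model: the paper learns representations with a BiLSTM and attends at multiple granularities, whereas this toy uses fixed hand-built vectors and mean-of-max pooling in both directions (question-to-relation and relation-to-question) to score candidate relations.

```python
import math

# Toy hand-built embeddings (assumption: the paper learns these with a
# BiLSTM encoder); related words share directions so alignment is visible.
EMB = {
    "where":       (1.0, 0.0, 0.0, 0.0),
    "was":         (0.0, 1.0, 0.0, 0.0),
    "person":      (0.0, 0.0, 1.0, 0.0),
    "born":        (0.0, 0.0, 0.0, 1.0),
    "place":       (1.0, 0.0, 0.0, 0.3),
    "of":          (0.0, 1.0, 0.0, 0.0),
    "birth":       (0.0, 0.0, 0.0, 1.0),
    "nationality": (0.0, 0.0, 1.0, 0.2),
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

def bidirectional_attention_score(q_tokens, r_tokens):
    """Score a candidate relation against the question: align every
    token pair (fine-grained alignment), then pool in both directions
    so information from both sides is aggregated."""
    sim = [[cosine(EMB[q], EMB[r]) for r in r_tokens] for q in q_tokens]
    # question-to-relation: each question token attends to its best relation token
    q2r = sum(max(row) for row in sim) / len(q_tokens)
    # relation-to-question: each relation token attends to its best question token
    r2q = sum(max(sim[i][j] for i in range(len(q_tokens)))
              for j in range(len(r_tokens))) / len(r_tokens)
    return (q2r + r2q) / 2

question = ["where", "was", "person", "born"]
candidates = [["place", "of", "birth"], ["nationality"]]
best = max(candidates, key=lambda r: bidirectional_attention_score(question, r))
print(best)  # ['place', 'of', 'birth']
```

Because "born" and "birth" share an embedding direction, the relation "place of birth" aligns with the question better than "nationality" under both attention directions; the real model makes the same comparison over learned word- and character-level representations.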
Authors: HE Ru-han; TANG Jiao; SHI Ai-wu; CHEN Jia; LI Xiang-peng; HU Xin-rong (School of Mathematics and Computer Science, Wuhan Textile University, Wuhan 430000, China; Engineering Research Center of Hubei Province for Clothing Information, Wuhan Textile University, Wuhan 430000, China)
Source: Computer Engineering and Design (《计算机工程与设计》, Peking University Core Journal), 2022, No. 2, pp. 560-566 (7 pages)
Funding: National Natural Science Foundation of China, General Program (61170093); Key Project of the Science and Technology Research Program of the Hubei Provincial Department of Education (D20141603)
Keywords: named entity recognition; entity disambiguation; relation detection; attention mechanism; knowledge base question answering
