Journal Articles
3 articles found
1. An Optimal Query Graph Selection Method for Complex Question Answering over Knowledge Graphs Based on Hierarchical Pooling Sequence Matching
Authors: Wang Dong, Zhou Sihang, Huang Jian, Zhang Zhongjie · 《系统工程与电子技术》 (Systems Engineering and Electronics), EI, CSCD, PKU Core, 2024, Issue 8, pp. 2686-2695 (10 pages)
In complex question answering over knowledge graphs, traditional query-graph semantic parsing methods must semantically encode a large number of structurally complex candidate query graphs at the ranking stage to obtain multi-dimensional feature representations of each. However, the global max or average pooling used during encoding often lacks the ability to extract representative features. To address this problem, an optimal query graph selection method based on hierarchical pooling sequence matching is proposed. During interactive modeling of candidate query graphs, a hierarchical-pooling sliding-window technique extracts, layer by layer, both the locally salient features and the global semantic features of question/query-graph sequence pairs, so that the resulting feature vectors better support semantic matching scores for candidate query graphs. Extensive experiments were conducted on two popular complex QA datasets, MetaQA and WebQuestionsSP. The results show that introducing hierarchical pooling effectively extracts representative semantic features of complex query-graph sequences, strengthens the interactive encoding capability of the original ranking model, and helps further improve the performance of complex KGQA systems.
Keywords: complex question answering over knowledge graphs; query graph semantic parsing; hierarchical pooling; interactive encoding
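The hierarchical pooling described in this abstract can be sketched as sliding-window local max pooling followed by global average pooling. The window size, stride, and plain-Python toy vectors below are assumptions for illustration only; the paper's actual model operates on learned encoder features:

```python
def hierarchical_pool(seq, window=2, stride=2):
    """Hierarchical pooling sketch: dimension-wise max pooling inside
    each sliding window (local salient features), then dimension-wise
    average pooling across the window outputs (global features)."""
    dim = len(seq[0])
    # Local stage: max within each sliding window, per dimension.
    local_feats = []
    for start in range(0, len(seq) - window + 1, stride):
        chunk = seq[start:start + window]
        local_feats.append([max(vec[d] for vec in chunk) for d in range(dim)])
    # Global stage: mean over the local window features, per dimension.
    return [sum(vec[d] for vec in local_feats) / len(local_feats)
            for d in range(dim)]

# Toy token features for a 4-token sequence with 2-dim embeddings.
feats = [[1.0, 2.0], [3.0, 0.0], [0.0, 5.0], [2.0, 2.0]]
print(hierarchical_pool(feats))  # [2.5, 3.5]
```

Unlike a single global max or average pool, the two-stage scheme keeps per-window maxima before averaging, which is what lets representative local features survive into the final vector.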
2. A Complex Temporal Knowledge Graph Reasoning Question Answering Model Incorporating Graph Attention
Authors: Jiang Wenjuan, Guo Yi, Fu Jiaojiao · 《计算机应用》 (Journal of Computer Applications), CSCD, PKU Core, 2024, Issue 10, pp. 3047-3057 (11 pages)
In the temporal knowledge graph question answering (TKGQA) task, models struggle to capture and exploit the temporal information implicit in questions to strengthen complex-question reasoning. To address this, a graph-attention-fused temporal knowledge graph reasoning question answering (GACTR) model is proposed. The model is pretrained on a quadruple-form temporal knowledge base (KB) and introduces a graph attention network (GAT) to effectively capture the implicit temporal information in questions. This is integrated with the relation representation trained by RoBERTa (Robustly optimized Bidirectional Encoder Representations from Transformers pretraining approach) to further enhance the temporal relation representation of the question; that representation is then combined with pretrained temporal knowledge graph (TKG) embeddings, and the highest-scoring entity or timestamp is returned as the predicted answer. Experiments on the largest benchmark dataset, CRONQUESTIONS, show that GACTR better captures implicit temporal information under temporal reasoning patterns and effectively improves complex reasoning. Compared with the baseline CRONKGQA (Knowledge Graph Question Answering on CRONQUESTIONS), GACTR improves Hits@1 on complex question types and time answer types by 34.6 and 13.2 percentage points, respectively; compared with TempoQR (Temporal Question Reasoning), the improvements are 8.3 and 2.8 percentage points.
Keywords: temporal knowledge graph; complex question answering; graph attention network; temporal reasoning; temporal relation representation
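The graph-attention step underlying this model can be illustrated with a single-head aggregation: a node attends over its neighbours and takes a softmax-weighted sum of their features. The sketch below uses unparameterised dot-product scores in place of the learned attention of a real GAT layer; node features and the graph are toy assumptions:

```python
import math

def gat_aggregate(h, neighbors, i):
    """Single-head graph-attention sketch for node i: softmax over
    dot-product scores against each neighbour, then an attention-weighted
    sum of neighbour features."""
    idx = neighbors[i]
    scores = [sum(a * b for a, b in zip(h[i], h[j])) for j in idx]
    m = max(scores)                       # subtract max for stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    alphas = [e / total for e in exps]    # attention coefficients, sum to 1
    dim = len(h[i])
    return [sum(a * h[j][d] for a, j in zip(alphas, idx)) for d in range(dim)]

# Toy 2-dim node features and an adjacency list for node 0.
h = {0: [1.0, 0.0], 1: [0.0, 1.0], 2: [1.0, 1.0]}
neighbors = {0: [1, 2]}
out = gat_aggregate(h, neighbors, 0)  # node 2 scores higher, so it dominates
```

A trained GAT additionally applies learned linear projections and a LeakyReLU-scored attention vector before the softmax; the softmax-weighted aggregation shown here is the part that lets the model weight time-bearing neighbours more heavily.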
3. A multi-attention RNN-based relation linking approach for question answering over knowledge base (cited by 1)
Authors: Li Huiying, Zhao Man, Yu Wenqi · Journal of Southeast University (English Edition), EI, CAS, 2020, Issue 4, pp. 385-392 (8 pages)
Aiming at the relation linking task for question answering over knowledge base, especially the multi-relation linking task for complex questions, a relation linking approach based on the multi-attention recurrent neural network (RNN) model is proposed, which works for both simple and complex questions. First, the vector representations of questions are learned by the bidirectional long short-term memory (Bi-LSTM) model at the word and character levels, and named entities in questions are labeled by the conditional random field (CRF) model. Candidate entities are generated based on a dictionary, the disambiguation of candidate entities is realized based on predefined rules, and named entities mentioned in questions are linked to entities in the knowledge base. Next, questions are classified into simple or complex questions by a machine learning method. Starting from the identified entities, one-hop relations in the knowledge base are collected as candidate relations for simple questions, and two-hop relations for complex questions. Finally, the multi-attention Bi-LSTM model is used to encode questions and candidate relations, compare their similarity, and return the candidate relation with the highest similarity as the result of relation linking. It is worth noting that the Bi-LSTM model with one attention is adopted for simple questions, and the Bi-LSTM model with two attentions for complex questions. The experimental results show that, based on the effective entity linking method, the Bi-LSTM model with the attention mechanism improves the relation linking effectiveness for both simple and complex questions, outperforming existing relation linking methods based on graph algorithms or linguistic understanding.
Keywords: question answering over knowledge base (KBQA); entity linking; relation linking; multi-attention; bidirectional long short-term memory (Bi-LSTM); large-scale complex question answering dataset (LC-QuAD)
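The final ranking step this abstract describes, encoding the question and each candidate relation and returning the most similar candidate, reduces to a cosine-similarity argmax once the encoders have produced fixed-size vectors. The vectors and relation names below are toy stand-ins, not outputs of the paper's Bi-LSTM:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def link_relation(question_vec, candidate_vecs):
    """Return the candidate relation whose encoding is most similar to
    the question encoding (the selection step of the ranking pipeline)."""
    return max(candidate_vecs,
               key=lambda rel: cosine(question_vec, candidate_vecs[rel]))

# Toy encodings standing in for multi-attention Bi-LSTM outputs.
q = [0.9, 0.1, 0.0]
cands = {"dbo:birthPlace": [1.0, 0.0, 0.1], "dbo:spouse": [0.0, 1.0, 0.2]}
print(link_relation(q, cands))  # dbo:birthPlace
```

For two-hop candidates on complex questions the same argmax applies; only the candidate set (paths rather than single relations) and the second attention head change.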