2 articles found.
1. 知识问答库组织与管理规范化问题研究 (Research on the Standardization of the Organization and Management of Knowledge Q&A Databases)
Author: 谢美萍. 《图书馆学研究》, 2007, No. 2, pp. 79-82 (4 pages).
This paper surveys how knowledge Q&A databases for digital reference services are organized and managed in Chinese libraries, compares the relevant issues from the perspective of standardization, and proposes corresponding requirements.
Keywords: knowledge Q&A database; standardization; digital reference service
2. A multi-attention RNN-based relation linking approach for question answering over knowledge base (cited by: 1)
Authors: Li Huiying, Zhao Man, Yu Wenqi. Journal of Southeast University (English Edition), indexed in EI and CAS, 2020, No. 4, pp. 385-392 (8 pages).
Aiming at the relation linking task for question answering over knowledge base, especially the multi-relation linking task for complex questions, a relation linking approach based on the multi-attention recurrent neural network (RNN) model is proposed, which works for both simple and complex questions. First, the vector representations of questions are learned by the bidirectional long short-term memory (Bi-LSTM) model at the word and character levels, and named entities in questions are labeled by the conditional random field (CRF) model. Candidate entities are generated from a dictionary, candidate entities are disambiguated by predefined rules, and named entities mentioned in questions are linked to entities in the knowledge base. Next, questions are classified as simple or complex by a machine learning method. Starting from the identified entities, one-hop relations in the knowledge base are collected as candidate relations for simple questions, and two-hop relations are collected for complex questions. Finally, the multi-attention Bi-LSTM model is used to encode questions and candidate relations, compare their similarity, and return the candidate relation with the highest similarity as the result of relation linking. Notably, the Bi-LSTM model with one attention is adopted for simple questions, and the Bi-LSTM model with two attentions is adopted for complex questions. The experimental results show that, built on the effective entity linking method, the Bi-LSTM model with the attention mechanism improves the relation linking effectiveness for both simple and complex questions, outperforming existing relation linking methods based on graph algorithms or linguistic understanding.
Keywords: question answering over knowledge base (KBQA); entity linking; relation linking; multi-attention; bidirectional long short-term memory (Bi-LSTM); large-scale complex question answering dataset (LC-QuAD)
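The abstract above describes scoring candidate relations by encoding the question and each candidate with an attention-equipped Bi-LSTM and returning the most similar candidate. The following is a minimal sketch of that similarity-scoring step only, under assumed dimensions and a single attention layer; it is an illustration, not the authors' released implementation, and the class and function names are hypothetical.

```python
# Sketch of attention-based Bi-LSTM similarity scoring for relation linking.
# Assumptions: PyTorch, toy hyperparameters, a single attention layer shared
# by question and relation encoding (the paper uses one or two attentions
# depending on question complexity).
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentiveBiLSTMEncoder(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int = 100, hidden_dim: int = 128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.attention = nn.Linear(2 * hidden_dim, 1)  # scores each time step

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) -> sequence vector (batch, 2 * hidden_dim)
        embedded = self.embedding(token_ids)
        states, _ = self.bilstm(embedded)                   # (batch, seq_len, 2*hidden)
        weights = F.softmax(self.attention(states), dim=1)  # attention over time steps
        return (weights * states).sum(dim=1)                # attention-weighted pooling


def link_relation(encoder: AttentiveBiLSTMEncoder,
                  question_ids: torch.Tensor,
                  candidate_ids: torch.Tensor) -> int:
    # question_ids: (1, q_len); candidate_ids: (num_candidates, r_len).
    # Returns the index of the candidate relation most similar to the question.
    with torch.no_grad():
        q_vec = encoder(question_ids)        # (1, 2*hidden)
        r_vecs = encoder(candidate_ids)      # (num_candidates, 2*hidden)
        scores = F.cosine_similarity(q_vec, r_vecs, dim=1)
    return int(scores.argmax().item())


if __name__ == "__main__":
    # Toy ids only; real use would map question words/characters and the
    # one-hop or two-hop candidate relation names to vocabulary ids.
    enc = AttentiveBiLSTMEncoder(vocab_size=1000)
    question = torch.randint(1, 1000, (1, 12))
    candidates = torch.randint(1, 1000, (4, 5))
    print("best candidate index:", link_relation(enc, question, candidates))
```

In the described pipeline, the candidate set fed to such a scorer would come from the entity-linking and question-classification stages: one-hop relations around the linked entity for simple questions, two-hop relation paths for complex ones.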