Funding: supported by the National Natural Science Foundation of China (No. 61502095).
Abstract: Aiming at the relation linking task for question answering over knowledge bases, especially the multi-relation linking task for complex questions, a relation linking approach based on a multi-attention recurrent neural network (RNN) model is proposed, which works for both simple and complex questions. First, vector representations of questions are learned by a bidirectional long short-term memory (Bi-LSTM) model at the word and character levels, and named entities in questions are labeled by a conditional random field (CRF) model. Candidate entities are generated from a dictionary, disambiguated by predefined rules, and the named entities mentioned in questions are linked to entities in the knowledge base. Next, questions are classified as simple or complex by a machine learning method. Starting from the identified entities, one-hop relations in the knowledge base are collected as candidate relations for simple questions, and two-hop relations are collected for complex questions. Finally, the multi-attention Bi-LSTM model is used to encode questions and candidate relations, compare their similarity, and return the candidate relation with the highest similarity as the result of relation linking. Notably, a Bi-LSTM model with one attention mechanism is adopted for simple questions, and a Bi-LSTM model with two attention mechanisms is adopted for complex questions. The experimental results show that, built on the effective entity linking method, the Bi-LSTM model with the attention mechanism improves relation linking for both simple and complex questions and outperforms existing relation linking methods based on graph algorithms or linguistic understanding.
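The final step above, scoring each candidate relation against the question and returning the highest-scoring one, can be sketched as follows. This is only a minimal illustration of the ranking logic: a toy bag-of-words encoder stands in for the paper's attention-based Bi-LSTM encoder, and all function names and example relations here are hypothetical, not from the paper.

```python
import math
from collections import Counter

def embed(text):
    # toy bag-of-words "encoding"; a stand-in for the Bi-LSTM encoder
    return Counter(text.lower().split())

def cosine(a, b):
    # cosine similarity between two sparse count vectors
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def link_relation(question, candidate_relations):
    # score every candidate relation against the question and return
    # the best-scoring one, mirroring the paper's final ranking step
    q = embed(question)
    return max(candidate_relations,
               key=lambda r: cosine(q, embed(r.replace("_", " "))))
```

For example, `link_relation("what is the birth place of Obama", ["birth_place", "death_place", "spouse"])` selects `"birth_place"`, because its tokens overlap most with the question; in the paper this overlap signal is replaced by learned attention-weighted representations.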
Funding: supported by the National Basic Research Program (973) of China (No. 2015CB352302), the National Natural Science Foundation of China (Nos. 61625107, U1611461, U1509206, and 61402403), the Key Program of Zhejiang Province, China (No. 2015C01027), the Chinese Knowledge Center for Engineering Sciences and Technology, and the Fundamental Research Funds for the Central Universities, China.
Abstract: Question answering is an important problem that aims to deliver specific answers to questions posed by humans in natural language. How to efficiently identify the exact answer to a given question has become an active line of research. Previous approaches to factoid question answering typically focus on modeling the semantic relevance or syntactic relationship between a given question and its corresponding answer. Most of these models suffer when a question contains very little content indicative of the answer. In this paper, we devise an architecture named the temporality-enhanced knowledge memory network (TE-KMN) and apply the model to a factoid question answering dataset from a trivia competition called quiz bowl. Unlike most existing approaches, our model encodes not only the content of questions and answers but also the temporal cues in a sequence of ordered sentences that gradually point toward the answer. Moreover, our model collaboratively uses external knowledge for a better understanding of a given question. The experimental results demonstrate that our method achieves better performance than several state-of-the-art methods.
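The core idea, reading a quiz-bowl question one ordered sentence at a time while external knowledge sharpens the guess, can be sketched with a toy example. This is not the TE-KMN architecture: the clue table below is a hypothetical stand-in for the learned knowledge memory, and the threshold rule stands in for the model's learned scoring; only the incremental, sentence-by-sentence accumulation of evidence is the point.

```python
from collections import Counter

# toy external knowledge "memory": candidate answer -> clue words
# (hypothetical entries for illustration only)
KNOWLEDGE = {
    "Mark Twain": {"huckleberry", "finn", "sawyer", "missouri", "twain"},
    "Charles Dickens": {"oliver", "twist", "copperfield", "london"},
}

def answer_incrementally(sentences, threshold=2):
    # read the question one ordered sentence at a time, accumulating
    # clue overlap per candidate; answer once evidence passes the threshold
    evidence = Counter()
    for i, sent in enumerate(sentences, 1):
        words = set(sent.lower().replace(",", "").replace(".", "").split())
        for cand, clues in KNOWLEDGE.items():
            evidence[cand] += len(words & clues)
        best, score = evidence.most_common(1)[0]
        if score >= threshold:
            return best, i  # answered after reading i sentences
    best, _ = evidence.most_common(1)[0]
    return best, len(sentences)
```

Given the ordered sentences `["This author wrote about Tom Sawyer.", "He also created Huckleberry Finn."]`, the sketch commits to `"Mark Twain"` only after the second sentence, once the accumulated clues suffice; the actual model replaces this hand-built counting with learned memory reads over the sentence sequence.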