Journal Articles
2 articles found
Strategies for the Kindergarten-to-Primary-School Transition Based on Children's Authentic Growth
1
Author: Xu Lin. Jiangxi Education (《江西教育》), 2023, Issue 43, pp. 90-91 (2 pages).
The "kindergarten-to-primary-school transition" refers to the effective linkage between two distinct learning stages: kindergarten and primary school. A smooth transition between these two stages has a crucial influence on a child's later development. Kindergarten teachers should therefore actively organize transition activities to help children move smoothly from kindergarten to primary school.
Keywords: kindergarten-to-primary-school transition; cooperative co-education; two-way linkage
A multi-attention RNN-based relation linking approach for question answering over knowledge base (cited by: 1)
2
Authors: Li Huiying, Zhao Man, Yu Wenqi. Journal of Southeast University (English Edition) (EI, CAS), 2020, Issue 4, pp. 385-392 (8 pages).
Aiming at the relation linking task for question answering over knowledge base, especially the multi-relation linking task for complex questions, a relation linking approach based on the multi-attention recurrent neural network (RNN) model is proposed, which works for both simple and complex questions. First, the vector representations of questions are learned by the bidirectional long short-term memory (Bi-LSTM) model at the word and character levels, and named entities in questions are labeled by the conditional random field (CRF) model. Candidate entities are generated based on a dictionary, the disambiguation of candidate entities is realized based on predefined rules, and named entities mentioned in questions are linked to entities in the knowledge base. Next, questions are classified into simple or complex questions by a machine learning method. Starting from the identified entities, one-hop relations are collected in the knowledge base as candidate relations for simple questions, and two-hop relations are collected as candidates for complex questions. Finally, the multi-attention Bi-LSTM model is used to encode questions and candidate relations, compare their similarity, and return the candidate relation with the highest similarity as the result of relation linking. It is worth noting that the Bi-LSTM model with one attention is adopted for simple questions, and the Bi-LSTM model with two attentions is adopted for complex questions. The experimental results show that, based on the effective entity linking method, the Bi-LSTM model with the attention mechanism improves the relation linking effectiveness for both simple and complex questions, and outperforms existing relation linking methods based on graph algorithms or linguistic understanding.
Keywords: question answering over knowledge base (KBQA); entity linking; relation linking; multi-attention; bidirectional long short-term memory (Bi-LSTM); large-scale complex question answering dataset (LC-QuAD)
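The abstract above describes encoding a question and its candidate relations with an attention-equipped Bi-LSTM and ranking candidates by similarity. The following is a minimal PyTorch sketch of that scoring step only, not the authors' implementation: the class and function names, dimensions, and the single attention head are illustrative assumptions, and the CRF labeling, entity linking, and question-classification stages are omitted.

```python
# Minimal sketch (assumed names, not the paper's code) of attention-based
# Bi-LSTM encoding and similarity ranking for relation linking.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveBiLSTMEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)  # one attention head

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) word indices
        states, _ = self.bilstm(self.embed(token_ids))       # (B, T, 2H)
        scores = self.attn(states).squeeze(-1)               # (B, T)
        weights = torch.softmax(scores, dim=-1)               # attention weights
        return (weights.unsqueeze(-1) * states).sum(dim=1)    # (B, 2H)

def rank_relations(encoder, question_ids, candidate_ids):
    """Return candidate indices sorted by cosine similarity to the question."""
    q_vec = encoder(question_ids)                  # (1, 2H)
    r_vecs = encoder(candidate_ids)                # (num_candidates, 2H)
    sims = F.cosine_similarity(q_vec, r_vecs, dim=-1)
    return sims.argsort(descending=True), sims

# Toy usage with random indices standing in for tokenised text.
encoder = AttentiveBiLSTMEncoder(vocab_size=1000)
question = torch.randint(1, 1000, (1, 12))
candidates = torch.randint(1, 1000, (4, 6))        # 4 candidate relations
order, sims = rank_relations(encoder, question, candidates)
print(order[0].item(), sims.tolist())
```

This corresponds roughly to the simple-question case with a single attention; the complex-question case described in the abstract would add a second attention and draw candidates from two-hop relations around the linked entity.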