Leveraging Structured Information from a Passage to Generate Questions

Abstract: Question Generation (QG) is the task of using Artificial Intelligence (AI) technology to generate questions that can be answered by a span of text within a given passage. Existing research on QG in the educational field struggles with two challenges: mainstream seq-to-seq QG models fail to exploit the structured information in a passage, and specialized educational QG datasets are lacking. To address these challenges, a specialized QG dataset, the reading comprehension dataset from examinations for QG (named RACE4QG), is reconstructed by applying a new answer-tagging approach and a data-filtering strategy to the RACE dataset. Further, an end-to-end QG model that can exploit intra- and inter-sentence information to generate better questions is proposed. In this model, the encoder is a Gated Recurrent Units (GRU) network that takes the concatenation of word embeddings, answer tags, and Graph Attention neTworks (GAT) embeddings as input. The hidden states of the GRU are refined with a gated self-attention to obtain the final passage-answer representation, which is fed to the decoder. Results show that the model outperforms baselines on both automatic metrics and human evaluation: it improves the baselines by 0.44, 1.32, and 1.34 points on BLEU-4, ROUGE-L, and METEOR, respectively, indicating its effectiveness and reliability. The remaining gap with human expectations also reflects the research potential.
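The abstract describes an encoder whose GRU hidden states pass through a gated self-attention before reaching the decoder: each state attends over all states, the resulting context is fused with the state, and a learned gate decides how much of the fused representation to keep. The sketch below illustrates only that gating step, in a minimal dependency-free form; the toy dimensions, random weights, and the names `gated_self_attention`, `Wf`, and `Wg` are illustrative assumptions, not the paper's implementation.

```python
import math
import random

random.seed(0)

def matvec(W, x):
    """Multiply a weight matrix (list of rows) by a vector."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def gated_self_attention(H, Wf, Wg):
    """For each hidden state h in H, attend over all states, fuse the
    attention context with h, and gate between the fused vector and h."""
    out = []
    for h in H:
        # dot-product attention scores of h against every state in H
        scores = softmax([sum(a * b for a, b in zip(h, other)) for other in H])
        # context vector: attention-weighted sum of all hidden states
        ctx = [sum(s * other[i] for s, other in zip(scores, H))
               for i in range(len(h))]
        concat = h + ctx                                   # [h; ctx]
        f = [math.tanh(v) for v in matvec(Wf, concat)]     # fused candidate
        g = [1 / (1 + math.exp(-v)) for v in matvec(Wg, concat)]  # gate in (0, 1)
        # gated mix of the fused candidate and the original state
        out.append([gi * fi + (1 - gi) * hi for gi, fi, hi in zip(g, f, h)])
    return out

d = 4  # toy hidden size
H = [[random.uniform(-1, 1) for _ in range(d)] for _ in range(3)]  # 3 GRU states
Wf = [[random.uniform(-0.5, 0.5) for _ in range(2 * d)] for _ in range(d)]
Wg = [[random.uniform(-0.5, 0.5) for _ in range(2 * d)] for _ in range(d)]
refined = gated_self_attention(H, Wf, Wg)  # same shape as H, one refined state per input
```

The gate lets the model fall back on the original GRU state when the passage-wide context adds little, which is the usual motivation for this kind of gated fusion.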
Source: Tsinghua Science and Technology (indexed in SCIE, EI, CAS, CSCD), 2023, Issue 3, pp. 464-474 (11 pages).
Funding: This work was supported by the National Natural Science Foundation of China (No. 62166050), Yunnan Fundamental Research Projects (No. 202201AS070021), the Yunnan Innovation Team of Education Informatization for Nationalities, the Scientific Technology Innovation Team of Educational Big Data Application Technology in Universities of Yunnan Province, and the Yunnan Normal University Graduate Research and Innovation Fund in 2020 (No. ysdyjs2020006).