Journal Articles
1 article found
Recent advances of neural text generation: Core tasks, datasets, models and challenges (Cited by: 2)
Authors: JIN HanQi, CAO Yue, WANG TianMing, XING XinYu, WAN XiaoJun. Science China (Technological Sciences) (SCIE, EI, CAS, CSCD), 2020, No. 10, pp. 1990-2010 (21 pages).
In recent years, deep neural networks have achieved great success in solving many natural language processing tasks. In particular, substantial progress has been made on neural text generation, which takes linguistic and non-linguistic input and generates natural language text. This survey aims to provide an up-to-date synthesis of the core tasks in neural text generation and the architectures adopted to handle them, and to draw attention to the challenges in neural text generation. We first outline the mainstream neural text generation frameworks, and then introduce the datasets, advanced models and challenges of four core text generation tasks in detail, including AMR-to-text generation, data-to-text generation, and two text-to-text generation tasks (i.e., text summarization and paraphrase generation). Finally, we present future research directions for neural text generation. This survey can be used as a guide and reference for researchers and practitioners in this area.
Keywords: natural language generation; neural text generation; AMR-to-text; data-to-text; text summarization; paraphrase generation