
Pre-trained models for natural language processing: A survey (Cited by: 151)

Abstract: Recently, the emergence of pre-trained models (PTMs) has brought natural language processing (NLP) into a new era. In this survey, we provide a comprehensive review of PTMs for NLP. We first briefly introduce language representation learning and its research progress. Then we systematically categorize existing PTMs according to a taxonomy built from four different perspectives. Next, we describe how to adapt the knowledge of PTMs to downstream tasks. Finally, we outline some potential directions for future research on PTMs. This survey is intended to be a hands-on guide for understanding, using, and developing PTMs for various NLP tasks.
Source: Science China (Technological Sciences) (indexed in SCIE, EI, CAS, CSCD), 2020, Issue 10, pp. 1872-1897 (26 pages).
Funding: the National Natural Science Foundation of China (Grant Nos. 61751201 and 61672162), the Shanghai Municipal Science and Technology Major Project (Grant No. 2018SHZDZX01), and ZJLab.
Related literature

Co-cited references: 783

Citing articles: 151

Second-level citing articles: 502
