
An Exploration of Pre-trained Language Models (Cited by: 3)

On the Pre Training Language Models
Abstract: With the spread of artificial-intelligence applications, natural language processing, as the core of cognitive intelligence, has become one of the greatest challenges for artificial intelligence. The rise of pre-trained language models represented by GPT and BERT has produced a series of breakthroughs in many natural language processing tasks, such as question answering, summarization, and reading comprehension, with markedly improved results. This paper introduces mainstream pre-trained language models and their influence on the field of natural language processing from three aspects: the concept of pre-trained language models, the mainstream technical methods, and their applications.
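The pre-training paradigm the abstract surveys can be illustrated with a toy sketch (not from the paper): a BERT-style masked language model assigns a probability distribution over its vocabulary for a masked position, and the top-scoring candidate fills the blank. Here the model's logits are hard-coded purely for illustration; a real model would compute them from context.

```python
import math

# Toy vocabulary; a real pre-trained model has tens of thousands of tokens.
VOCAB = ["Paris", "London", "banana"]

def softmax(logits):
    """Convert raw logits into a probability distribution."""
    m = max(logits)                      # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits a model might produce for the masked position in
# "The capital of France is [MASK]."
logits = [4.1, 1.7, -2.3]
probs = softmax(logits)
prediction = VOCAB[probs.index(max(probs))]
print(prediction)  # "Paris"
```

Fine-tuning for the downstream tasks named in the abstract (text classification, reading comprehension) reuses the same pre-trained encoder and only adds a small task-specific output layer on top.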
Author: LI Jingyu (School of Telecommunication Engineering, Beijing Polytechnic, Beijing 100176, China)
Source: Science & Technology Information, 2022, No. 19, pp. 5-9, 18 (6 pages)
Fund: 2022 in-school science and technology project of Beijing Polytechnic (Project No. 2022X014-KXY)
Keywords: natural language processing; pre-trained language models; GPT; BERT; text classification; machine reading comprehension
Related literature: references: 4; secondary references: 18; co-citing documents: 29; co-cited documents: 42; citing documents: 3; secondary citing documents: 10
