
Enhancing the Understanding Ability of BERT Based on Prompt Learning
Abstract: Prompt learning aims to reduce the gap between a language model's pre-training task and its downstream tasks by means of prompt templates. The difficulty lies in designing the prompt templates. To address this, a new method is proposed that optimizes continuous prompts via automatically searched discrete prompts during prompt-template construction. The automatic prompt search is trained with the masked language model pre-training task of the Bidirectional Encoder Representations from Transformers (BERT) model; the continuous prompt optimization trains the mapping tensor of the automatically searched discrete prompts in a continuous space, and the prompt templates are trained according to the loss function. Experiments on the public benchmark SuperGLUE show that prompt-learning-based BERT achieves significant improvements in both accuracy and F1 score over the original BERT model.
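The continuous prompt optimization described above, training a tensor of prompt vectors in BERT's embedding space through the task loss while the encoder stays frozen, can be illustrated with the minimal Python sketch below. It is an assumption-laden illustration of the general technique, not the paper's implementation: the class name SoftPromptBERT, the prompt length PROMPT_LEN, the use of bert-base-uncased, and the linear classification head are all illustrative choices, and the paper's initialization of the continuous prompts from automatically searched discrete prompts is omitted (random initialization is used instead).

# A minimal sketch of continuous prompt optimization with a frozen BERT encoder.
# Illustrative only: SoftPromptBERT, PROMPT_LEN, and the classifier head are
# assumptions; the paper additionally initializes the prompt tensor from
# automatically searched discrete prompts, which is omitted here.
import torch
import torch.nn as nn
from transformers import BertTokenizer, BertModel

PROMPT_LEN = 4  # number of continuous prompt vectors (assumed hyperparameter)

class SoftPromptBERT(nn.Module):
    def __init__(self, model_name="bert-base-uncased", num_labels=2):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        for p in self.bert.parameters():          # freeze the pre-trained encoder
            p.requires_grad = False
        hidden = self.bert.config.hidden_size
        # The continuous prompts: a trainable tensor living in BERT's embedding space.
        self.prompt = nn.Parameter(torch.randn(PROMPT_LEN, hidden) * 0.02)
        self.classifier = nn.Linear(hidden, num_labels)

    def forward(self, input_ids, attention_mask):
        tok_emb = self.bert.embeddings.word_embeddings(input_ids)     # (B, L, H)
        batch = input_ids.size(0)
        prompt = self.prompt.unsqueeze(0).expand(batch, -1, -1)       # (B, P, H)
        inputs_embeds = torch.cat([prompt, tok_emb], dim=1)           # prepend prompts
        prompt_mask = torch.ones(batch, PROMPT_LEN,
                                 dtype=attention_mask.dtype,
                                 device=attention_mask.device)
        mask = torch.cat([prompt_mask, attention_mask], dim=1)
        out = self.bert(inputs_embeds=inputs_embeds, attention_mask=mask)
        pooled = out.last_hidden_state[:, 0]      # representation at the first prompt position
        return self.classifier(pooled)

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = SoftPromptBERT()
batch = tokenizer(["the movie was great", "the plot was dull"],
                  return_tensors="pt", padding=True)
logits = model(batch["input_ids"], batch["attention_mask"])
loss = nn.CrossEntropyLoss()(logits, torch.tensor([1, 0]))
loss.backward()   # gradients flow only into the prompt tensor and the classifier

Because the encoder is frozen, only the prompt tensor and the small head are updated, which is what makes tuning continuous prompts cheap; replacing the linear head with BERT's masked-language-model head over a [MASK] position would bring the sketch closer to the prompt-learning setup the abstract describes.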
Authors: 陈亚当 (CHEN Ya-dang), 杨刚 (YANG Gang), 王铎霖 (WANG Duo-lin), 余文斌 (YU Wen-bin) — School of Computer Science, Nanjing University of Information Science and Technology, Nanjing 210044, China; Engineering Research Center of Digital Forensics, Ministry of Education, Nanjing University of Information Science and Technology, Nanjing 210044, China
Source: Information Technology (《信息技术》), 2024, No. 6, pp. 87-93 (7 pages)
Funding: National Natural Science Foundation of China (61802197).
Keywords: prompt learning; Bidirectional Encoder Representations from Transformers (BERT); natural language processing; continuous prompt optimization; masked language model