
MTL-BERT: A Multi-Task Learning Model Utilizing BERT for Chinese Text
Abstract: Single-task learning is often limited by the shortcomings of a single objective function, whereas multi-task learning can effectively exploit prior knowledge of task relatedness and has therefore attracted attention from the research community. In Chinese natural language processing, research on multi-task learning is extremely scarce, as the field must address both Chinese text feature extraction and multi-task modeling. This paper proposes a multi-task learning model, MTL-BERT. First, BERT is used as the feature extractor to improve the model's generalization. Second, since classification and regression are the two principal problems in machine learning, a task-weight self-adaptation framework is proposed for mixed multi-label classification and regression tasks; under this framework, the task weights are trained jointly with the model parameters. Finally, the effectiveness of the multi-task learning algorithm is verified theoretically from a maximum-likelihood perspective. Experiments on real Chinese datasets show that MTL-BERT performs well.
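The abstract does not spell out the exact form of the adaptive weighting, so the sketch below is only one plausible reading: it implements the standard maximum-likelihood (homoscedastic-uncertainty) weighting, in which per-task log-variances are learned jointly with the model and turn into loss weights, for one multi-label classification head and one regression head. The class name AdaptiveMultiTaskLoss and all tensor names are hypothetical illustrations, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveMultiTaskLoss(nn.Module):
    """Hypothetical sketch: combine a multi-label classification loss and a
    regression loss with weights derived from learnable per-task log-variances,
    so the task weights are trained jointly with the model parameters."""

    def __init__(self, num_tasks: int = 2):
        super().__init__()
        # log(sigma_i^2) per task; zeros mean every task starts with weight 1.
        self.log_vars = nn.Parameter(torch.zeros(num_tasks))

    def forward(self, cls_logits, cls_targets, reg_preds, reg_targets):
        # Multi-label classification: independent sigmoid per label,
        # cls_targets is a float multi-hot matrix of the same shape as cls_logits.
        cls_loss = F.binary_cross_entropy_with_logits(cls_logits, cls_targets)
        # Regression head: mean squared error.
        reg_loss = F.mse_loss(reg_preds, reg_targets)
        losses = torch.stack([cls_loss, reg_loss])
        # Gaussian maximum-likelihood form, applied to both heads as a common
        # simplification: sum_i L_i / (2 * sigma_i^2) + (1/2) * log(sigma_i^2).
        precision = torch.exp(-self.log_vars)
        return (0.5 * precision * losses + 0.5 * self.log_vars).sum()
```

In use, the criterion's parameters would be handed to the optimizer together with the BERT encoder's, e.g. torch.optim.Adam(list(model.parameters()) + list(criterion.parameters())), which is one way to realize the "weights trained jointly with model parameters" described in the abstract.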
Authors: WU Qian-kun (武乾坤), PENG Dun-lu (彭敦陆), School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China
Source: Journal of Chinese Computer Systems (《小型微型计算机系统》), 2021, Issue 2, pp. 291-296 (6 pages); indexed in CSCD and the Peking University Core Journals list.
Funding: Supported by the National Natural Science Foundation of China (61772342, 61703278).
Keywords: multi-task learning; BERT; natural language processing; deep learning