Journal Articles: 1 article found
Targeted BERT Pre-training and Fine-Tuning Approach for Entity Relation Extraction
Authors: Chao Li, Zhao Qiu. 《国际计算机前沿大会会议论文集》, 2021, No. 2, pp. 116-125 (10 pages)
Entity relation extraction (ERE) is an important task in the field of information extraction. With the wide application of pre-trained language models (PLMs) in natural language processing (NLP), using PLMs has become a brand-new research direction for ERE. In this paper, BERT is used to extract entity relations, and a separated pipeline architecture is proposed: ERE is decomposed into an entity-relation classification sub-task and an entity-pair annotation sub-task, and the two sub-tasks conduct pre-training and fine-tuning independently. Combining dynamic and static masking, new Verb-MLM and Entity-MLM BERT pre-training tasks are put forward to strengthen the correlation between BERT pre-training and the targeted NLP downstream task, ERE. An inter-layer attention-sharing mechanism is added to the model, sharing attention parameters according to the similarity of the attention matrices. Comparative experiments on the SemEval-2010 Task 8 dataset demonstrate that the new MLM tasks and the inter-layer attention-sharing mechanism effectively improve the performance of BERT on entity relation extraction.
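The listing carries no code, and the abstract does not spell out the exact masking policy. As a minimal sketch only, the following Python assumes that an Entity-MLM-style objective always masks tokens inside entity mentions while other tokens keep the usual BERT masking rate, and that "static" versus "dynamic" masking amounts to fixing or resampling the random mask on each pass; the function and argument names (entity_mlm_mask, entity_spans) are hypothetical, not the authors' API.

    import random

    MASK_TOKEN = "[MASK]"

    def entity_mlm_mask(tokens, entity_spans, mask_prob=0.15, static=False, seed=0):
        """Entity-MLM-style masking sketch: tokens inside entity spans are
        always masked, other tokens are masked at the standard rate, so the
        model must recover entity mentions from context. static=True fixes
        the seed so the same mask is produced every epoch (static masking);
        static=False resamples on each call (dynamic masking)."""
        rng = random.Random(seed) if static else random.Random()
        entity_positions = {i for start, end in entity_spans
                            for i in range(start, end)}
        masked, labels = list(tokens), [None] * len(tokens)  # None = not predicted
        for i, tok in enumerate(tokens):
            p = 1.0 if i in entity_positions else mask_prob
            if rng.random() < p:
                labels[i] = tok          # target is the original token
                masked[i] = MASK_TOKEN   # replace it in the model input
        return masked, labels

    # Example: the entity mentions at positions 0-1 and 5 are always masked.
    tokens = ["Barack", "Obama", "was", "born", "in", "Hawaii", "."]
    masked, labels = entity_mlm_mask(tokens, entity_spans=[(0, 2), (5, 6)])

A Verb-MLM variant would be the same routine with verb positions (e.g. from a POS tagger) supplied in place of entity_spans.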
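The abstract likewise says attention parameters are shared "according to the similarity of the attention matrices" without giving the measure or the sharing rule. One plausible reading is sketched below, under the assumptions that similarity means cosine similarity between head-averaged attention maps of adjacent layers and that sharing is gated by a hypothetical threshold; neither detail is confirmed by the source.

    import numpy as np

    def layers_to_share(attn_maps, threshold=0.9):
        """attn_maps: list of per-layer attention matrices (seq_len x seq_len),
        e.g. averaged over heads on a probe batch. Returns pairs of adjacent
        layers whose attention patterns are similar enough that their
        attention parameters could be tied."""
        pairs = []
        for l in range(len(attn_maps) - 1):
            a, b = attn_maps[l].ravel(), attn_maps[l + 1].ravel()
            cos = float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
            if cos >= threshold:
                pairs.append((l, l + 1))
        return pairs

    # Example with random stand-in maps for a 12-layer encoder:
    maps = [np.random.rand(16, 16) for _ in range(12)]
    print(layers_to_share(maps))

Tying the query/key/value projections of the returned layer pairs would then reduce the attention parameter count, which is consistent with, though not stated in, the abstract.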
Keywords: entity relation extraction; BERT; Verb-MLM; Entity-MLM; inter-layer attention-sharing mechanism