D-BERT: Incorporating dependency-based attention into BERT for relation extraction (Cited by: 1)
Authors: Yuan Huang, Zhixing Li, Wei Deng, Guoyin Wang, Zhimin Lin. CAAI Transactions on Intelligence Technology (EI), 2021, Issue 4, pp. 417-425 (9 pages)
Relation extraction between entity pairs is an increasingly critical area in natural language processing. Recently, the pre-trained bidirectional encoder representation from transformers (BERT) has performed excellently on text classification and sequence labelling tasks. Here, high-level syntactic features that capture the dependency between each word and the target entities are incorporated into the pre-trained language model. Our model also utilizes the intermediate layers of BERT to acquire different levels of semantic information and designs multi-granularity features for the final relation classification. Our model offers a substantial improvement over published methods for relation extraction on widely used data sets.
Keywords: BERT, relation, processing
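The two ideas the abstract names — attention biased by the dependency parse, and pooling features from several intermediate BERT layers — can be sketched roughly as follows. This is a minimal illustration under assumed shapes, not the paper's exact formulation: the function names, the binary neighbour mask, and mean-pooling are all illustrative choices.

```python
import numpy as np

def dependency_attention(hidden, dep_mask):
    """Self-attention restricted to dependency-tree neighbours.

    hidden:   (seq_len, dim) token representations, e.g. one BERT layer's output
    dep_mask: (seq_len, seq_len) binary matrix, 1 where token j is a
              dependency neighbour of token i (self-loops included).
    Illustrative sketch; the paper's scoring function may differ.
    """
    scores = hidden @ hidden.T / np.sqrt(hidden.shape[-1])
    scores = np.where(dep_mask == 1, scores, -1e9)   # mask non-neighbours
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ hidden

def multi_granularity(layer_outputs):
    """Concatenate mean-pooled features from several intermediate layers,
    giving the classifier access to different levels of semantics."""
    return np.concatenate([h.mean(axis=0) for h in layer_outputs])
```

With an identity mask each token attends only to itself, which makes the masking behaviour easy to check; in practice the mask would come from a dependency parser and link each word to its head, dependents, and the target entities.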