
Multi-head attention graph convolutional network model: End-to-end entity and relation joint extraction based on multi-head attention graph convolutional network

Abstract: The joint extraction of entities and relations has attracted growing attention in the field of natural language processing (NLP). However, most existing methods rely on NLP tools to construct dependency trees and thereby obtain sentence-structure information: the adjacency matrix built from a dependency tree conveys syntactic information, but trees produced by external tools depend heavily on those tools, may describe the contextual semantics inaccurately, and introduce redundancy through large amounts of irrelevant information. This paper presents a novel end-to-end model for joint entity and relation extraction based on a multi-head attention graph convolutional network (MAGCN), which does not rely on external tools. MAGCN generates an adjacency matrix through a multi-head attention mechanism to form an attention graph convolutional network, uses head selection to identify multiple relations, and effectively improves the prediction of overlapping relations. The authors experiment extensively on three public datasets: NYT, WebNLG, and CoNLL04. The results show that the method outperforms state-of-the-art results on the entity and relation extraction task.
Source: CAAI Transactions on Intelligence Technology (SCIE, EI), 2023, No. 2, pp. 468-477 (10 pages).
Funding: State Key Program of National Natural Science Foundation of China (61533018); National Natural Science Foundation of China (61402220); Philosophy and Social Science Foundation of Hunan Province (16YBA323); Natural Science Foundation of Hunan Province (2020JJ4525); Scientific Research Fund of Hunan Provincial Education Department (18B279, 19A439).
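The core idea described in the abstract, replacing a parser-derived dependency tree with an adjacency matrix produced by multi-head attention, can be sketched as follows. This is a minimal illustration, not the authors' implementation: all shapes, projection matrices, and the choice to average the per-head graphs before one graph-convolution step are assumptions for the sake of a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_adjacency(H, num_heads=4):
    """Return one soft adjacency matrix per attention head.

    H: (n_tokens, d_model) token representations. Each head projects H to
    queries/keys and scores token pairs with softmax(Q K^T / sqrt(d_k)),
    yielding a row-stochastic (n_tokens, n_tokens) graph over the sentence.
    """
    n, d = H.shape
    d_k = d // num_heads
    adjs = []
    for _ in range(num_heads):
        # Random projections stand in for learned weights in this sketch.
        Wq = rng.standard_normal((d, d_k)) / np.sqrt(d)
        Wk = rng.standard_normal((d, d_k)) / np.sqrt(d)
        Q, K = H @ Wq, H @ Wk
        adjs.append(softmax(Q @ K.T / np.sqrt(d_k)))
    return adjs

def gcn_layer(H, A, W):
    """One graph-convolution step over a soft adjacency A: relu(A H W)."""
    return np.maximum(A @ H @ W, 0.0)

# Toy sentence of 6 tokens with 16-dimensional representations.
n_tokens, d_model = 6, 16
H = rng.standard_normal((n_tokens, d_model))
adjs = attention_adjacency(H)
W = rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
# Average the per-head graphs and apply one GCN layer.
H_out = gcn_layer(H, np.mean(adjs, axis=0), W)
print(H_out.shape)
```

Because the attention scores are computed from the token representations themselves, no external dependency parser is needed, which is the property the abstract emphasizes; the soft, fully connected graph also avoids committing to a single (possibly wrong) tree.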