Journal Articles
2 articles found
1. Mathematical Named Entity Recognition Based on Adversarial Training and Self-Attention
Authors: Qiuyu Lai, Wang Kang, Lei Yang, Chun Yang, Delin Zhang. Intelligent Automation & Soft Computing, 2024, No. 4, pp. 649–664 (16 pages)
Mathematical named entity recognition (MNER) is one of the fundamental tasks in the analysis of mathematical texts. To address the problems of local instability, fuzzy entity boundaries, and long-distance dependencies between entities that current neural networks face in the Chinese mathematical entity recognition task, we propose a series of optimizations and construct an Adversarial Training and Bidirectional Long Short-Term Memory Self-Attention Conditional Random Field (AT-BSAC) model. In our model, the mathematical text was vectorized by the word embedding technique, and small perturbations were added to the word vectors to generate adversarial samples, while local features were extracted by a Bi-directional Long Short-Term Memory (BiLSTM) network. A self-attention mechanism was incorporated to extract longer-range dependencies between entities. The experimental results demonstrated that the AT-BSAC model achieved a precision (P) of 93.88%, a recall (R) of 93.84%, and an F1-score of 93.74%, the latter being 8.73% higher than the F1-score of the previous Bi-directional Long Short-Term Memory Conditional Random Field (BiLSTM-CRF) model. These results demonstrate the effectiveness of the proposed model in mathematical named entity recognition.
Keywords: named entity recognition, BiLSTM-CRF, adversarial training, self-attention mechanism, mathematical texts
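The abstract above does not give implementation details for generating adversarial samples. A minimal sketch of the standard FGM-style approach it describes, i.e. adding a small gradient-direction perturbation to the word embeddings, could look like the following (all names and the toy data are illustrative, not from the paper; a real model would obtain `grad` via backpropagation):

```python
import numpy as np

def fgm_perturb(embeddings: np.ndarray, grad: np.ndarray,
                epsilon: float = 1.0) -> np.ndarray:
    """Fast Gradient Method: add a perturbation in the direction of the
    loss gradient, rescaled to L2 norm `epsilon`, to the word embeddings."""
    norm = np.linalg.norm(grad)
    if norm == 0.0:
        return embeddings  # zero gradient: nothing to perturb
    r_adv = epsilon * grad / norm  # ||r_adv||_2 == epsilon
    return embeddings + r_adv

# toy example: a 4-token sentence with 8-dimensional word vectors
rng = np.random.default_rng(0)
emb = rng.normal(size=(4, 8))
grad = rng.normal(size=(4, 8))  # stands in for dLoss/dEmbeddings
emb_adv = fgm_perturb(emb, grad, epsilon=0.5)
print(round(float(np.linalg.norm(emb_adv - emb)), 6))  # 0.5
```

During adversarial training, the loss would be computed once on `emb` and once on `emb_adv`, and the two losses summed before the parameter update.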
2. Towards better entity linking (cited by: 2)
Authors: Mingyang LI, Yuqing XING, Fang KONG, Guodong ZHOU. Frontiers of Computer Science (SCIE, EI, CSCD), 2022, No. 2, pp. 55–67 (13 pages)
As one of the most important components in knowledge graph construction, entity linking has drawn increasing attention in the last decade. In this paper, we propose two improvements towards better entity linking. On one hand, we propose a simple but effective coarse-to-fine unsupervised knowledge base (KB) extraction approach that improves the quality of the KB, allowing entity linking to be conducted more efficiently. On the other hand, we propose a highway network framework that bridges key words and the sequential information captured by a self-attention mechanism, so as to better represent both local and global information. Detailed experimentation on six public entity linking datasets verifies the effectiveness of both approaches.
Keywords: entity linking, knowledge base extraction, self-attention mechanism, highway network
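The highway combination mentioned in the abstract gates, per dimension, between an input representation and a transformed one. A minimal sketch of such a layer mixing local (key-word) and global (self-attention) features might look like this (the variable names and the interpretation of `x` and `h` are assumptions for illustration, not the paper's exact architecture):

```python
import numpy as np

def sigmoid(x: np.ndarray) -> np.ndarray:
    return 1.0 / (1.0 + np.exp(-x))

def highway(x: np.ndarray, h: np.ndarray,
            W_t: np.ndarray, b_t: np.ndarray) -> np.ndarray:
    """Highway combination: a learned gate t in (0, 1) decides, per dimension,
    how much of the transformed features h to keep vs. the carried input x."""
    t = sigmoid(x @ W_t + b_t)      # transform gate
    return t * h + (1.0 - t) * x    # gated elementwise mix

rng = np.random.default_rng(1)
d = 6
x = rng.normal(size=(3, d))         # e.g. local key-word features
h = rng.normal(size=(3, d))         # e.g. global self-attention features
W_t = 0.1 * rng.normal(size=(d, d))
b_t = np.zeros(d)
y = highway(x, h, W_t, b_t)

# each output element is a convex combination of x and h, so it lies between them
lo, hi = np.minimum(x, h), np.maximum(x, h)
print(bool(np.all((y >= lo - 1e-9) & (y <= hi + 1e-9))))  # True
```

Because the gate output is strictly between 0 and 1, the layer can interpolate smoothly between passing the input through unchanged and fully replacing it with the transformed features.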