Abstract
In existing relation extraction methods, traditional deep learning models suffer from the long-distance dependency problem and do not consider the correlation between the model's input and output. To address these problems, this paper proposes a relation extraction method that combines an LSTM (long short-term memory) model with an attention mechanism. First, the text is vectorized and local text features are extracted. The local features are then fed into a bidirectional LSTM model, and an attention mechanism computes importance weights for the correlation between the LSTM model's input and output, from which a global text feature is obtained. Finally, the local and global features are fused, and a classifier outputs the classification result. Experiments on the SemEval-2010 Task 8 corpus show that the method improves accuracy and stability over traditional deep learning methods, providing methodological support for fields such as automatic question answering, information retrieval, and ontology learning.
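The pipeline the abstract describes (BiLSTM hidden states → attention-based importance weighting → global feature → fusion with a local feature → classifier input) can be sketched roughly as follows. This is an illustrative sketch only, not the authors' implementation: the tanh-scored attention vector, the max-pooled "local" feature, and all variable names are assumptions made for the example.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, w):
    """Attention-weighted pooling of BiLSTM hidden states.

    H: (T, d) hidden state per time step; w: (d,) attention vector
    (hypothetical tanh scoring, one common formulation).
    """
    scores = np.tanh(H) @ w   # importance score for each time step
    alpha = softmax(scores)   # normalized attention weights, sum to 1
    return alpha @ H          # (d,) global feature: weighted sum of states

rng = np.random.default_rng(0)
T, d = 6, 8                          # toy sequence length and hidden size
H = rng.standard_normal((T, d))      # stand-in for BiLSTM outputs
w = rng.standard_normal(d)

global_feat = attention_pool(H, w)
local_feat = H.max(axis=0)           # e.g. max pooling as a local feature
fused = np.concatenate([local_feat, global_feat])  # fusion before classifier
print(fused.shape)
```

The fused vector would then be passed to a classifier (e.g. a softmax layer over relation labels); in a real model `H` comes from a trained bidirectional LSTM and `w` is learned jointly with it.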
Authors
Wang Hong; Shi Jinchuan; Zhang Zhiwei (School of Computer Science & Technology, Civil Aviation University of China, Tianjin 300300, China)
Source
Application Research of Computers (《计算机应用研究》), 2018, No. 5, pp. 1417-1420, 1440 (5 pages); indexed in CSCD and the Peking University Core Journal list (北大核心)
Funding
Supported by the National Natural Science Foundation of China (U1633110, U1533104, U1233113)
Keywords
text information
semantic relation
relation extraction
LSTM
attention mechanism