
Semantic relation extraction model via attention based neural Turing machine (Cited by: 1)
Abstract: To address the poor performance on long sentences and the weak expressive power of core words in semantic relation extraction (semantic relation classification), an Attention-based bidirectional Neural Turing Machine (Ab-NTM) model was proposed. First, a Neural Turing Machine (NTM) was used in place of a plain Recurrent Neural Network (RNN), with a Long Short-Term Memory (LSTM) network serving as its controller; the NTM's larger, mutually non-interfering external memory strengthens the model's ability to remember long sentences. Second, an attention layer was built to organize word-level context information, allowing the model to emphasize the core words in a sentence. Finally, the relation labels were obtained from a classifier. Experiments on the SemEval-2010 Task 8 public dataset show that the proposed model achieves an F1-score of 86.2% and outperforms most state-of-the-art comparison methods.
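
The pipeline described in the abstract (a bidirectional recurrent encoder built on an NTM with an LSTM controller, word-level attention pooling, and a relation classifier) can be illustrated with a short sketch. The paper itself provides no code, so the following is a minimal PyTorch sketch under stated simplifications: the NTM's external memory matrix and its read/write addressing are omitted, a plain bidirectional LSTM stands in for the bidirectional NTM encoder, and the class name AttnBiEncoderClassifier as well as the layer sizes emb_dim and hidden_dim are illustrative assumptions, not the authors' implementation or settings.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AttnBiEncoderClassifier(nn.Module):
    """Word-level attention over a bidirectional recurrent encoder,
    followed by a linear relation classifier (simplified sketch of the
    Ab-NTM pipeline; the NTM external memory is not modeled here)."""
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=128, num_classes=19):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Stand-in for the bidirectional NTM with an LSTM controller.
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)        # word-level attention scorer
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):                        # token_ids: (B, T)
        h, _ = self.encoder(self.embed(token_ids))       # (B, T, 2H)
        scores = self.attn(torch.tanh(h)).squeeze(-1)    # (B, T)
        weights = F.softmax(scores, dim=-1).unsqueeze(-1)  # (B, T, 1)
        sentence = (weights * h).sum(dim=1)              # attention-pooled vector (B, 2H)
        return self.classifier(sentence)                 # relation logits (B, num_classes)

# Usage: a batch of 2 padded sentences of length 10 over a toy 5000-word vocabulary.
model = AttnBiEncoderClassifier(vocab_size=5000)
logits = model(torch.randint(0, 5000, (2, 10)))
print(logits.shape)  # torch.Size([2, 19]); SemEval-2010 Task 8 has 19 labels

Training such a sketch would minimize cross-entropy over the 19 SemEval-2010 Task 8 labels (9 relation types in both directions plus Other); the 86.2% quoted in the abstract is an F1-score on this dataset, conventionally reported as the task's official macro-averaged F1 over the nine directed relations, excluding Other.
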
Authors: ZHANG Runyan (张润岩), MENG Fanrong (孟凡荣), ZHOU Yong (周勇), LIU Bing (刘兵) (School of Computer Science and Technology, China University of Mining and Technology, Xuzhou, Jiangsu 221116, China; Institute of Electronics, Chinese Academy of Sciences, Beijing 100080, China)
Source: Journal of Computer Applications (《计算机应用》), CSCD, Peking University Core Journal, 2018, Issue 7: 1831-1838 (8 pages)
Funding: General Program of the National Natural Science Foundation of China (61572505)
Keywords: Natural Language Processing (NLP); semantic relation extraction; Recurrent Neural Network (RNN); bidirectional Neural Turing Machine (NTM); attention mechanism
