
Event detection via recurrent and convolutional networks based on language model

Cited by: 4
Abstract: The main difficulties in event detection currently lie in word polysemy and the detection of multi-event sentences. To address these problems, we propose a novel recurrent and convolutional neural network with attention based on language models (LM-ARCNN). The model first computes word embeddings of the input sentence with a language model (ELMo) and feeds them into a long short-term memory network (LSTM) to obtain sentence-level features; an attention mechanism then selects, from these sentence-level features, those most relevant to the candidate trigger words. Finally, both sets of features are fed into a convolutional neural network with a dynamic multi-pooling layer, which pools the segments around the event trigger separately so as to preserve more informative context chunks. Experiments on the ACE2005 English corpus show that the model achieves an F1 score of 74.4%, 0.4% higher than the best existing document-embedding-enhanced model (DEEB).
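The abstract describes the pipeline but not the authors' implementation; the following is a minimal PyTorch sketch of that pipeline, assuming pre-computed ELMo-style embeddings as input. The layer sizes, the attention scoring function, the number of event types, and the two-segment multi-pooling split are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn


class LMARCNNSketch(nn.Module):
    """Illustrative sketch of the LM-ARCNN pipeline from the abstract:
    language-model word embeddings -> BiLSTM sentence features ->
    trigger-oriented attention -> convolution with dynamic multi-pooling ->
    event-type classifier. All dimensions are assumptions."""

    def __init__(self, emb_dim=1024, hidden=256, num_filters=150,
                 kernel_size=3, num_event_types=34):
        super().__init__()
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                            bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)   # scores each token's relevance
        self.conv = nn.Conv1d(2 * hidden, num_filters, kernel_size,
                              padding=kernel_size // 2)
        # left/right max-pooled segments + attention summary -> classifier
        self.classifier = nn.Linear(2 * num_filters + 2 * hidden,
                                    num_event_types)

    def forward(self, lm_embeddings, trigger_idx):
        # lm_embeddings: (batch, seq_len, emb_dim), pre-computed by ELMo
        # trigger_idx:   (batch,) position of the candidate trigger word
        h, _ = self.lstm(lm_embeddings)                 # (batch, seq_len, 2*hidden)

        # attention: weight sentence-level features by relevance to the trigger
        alpha = torch.softmax(self.attn(h).squeeze(-1), dim=-1)
        attn_feat = torch.bmm(alpha.unsqueeze(1), h).squeeze(1)  # (batch, 2*hidden)

        # convolution over the LSTM features
        c = torch.relu(self.conv(h.transpose(1, 2)))    # (batch, filters, seq_len)

        # dynamic multi-pooling: max-pool the segments left and right of the trigger
        pooled = []
        for b in range(c.size(0)):
            t = int(trigger_idx[b])
            left = c[b, :, : t + 1].max(dim=-1).values
            right = c[b, :, t:].max(dim=-1).values
            pooled.append(torch.cat([left, right], dim=-1))
        pooled = torch.stack(pooled)                    # (batch, 2*filters)

        return self.classifier(torch.cat([pooled, attn_feat], dim=-1))
```

The two-segment pooling follows the DMCNN idea cited in the abstract (pooling before and after the trigger rather than over the whole sentence); how the paper combines the attention features with the pooled features is not specified, so simple concatenation is assumed here.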
Authors: SHI Zheer (施喆尔), CHEN Jinxiu (陈锦秀) — School of Information Science and Engineering, Xiamen University, Xiamen 361005, China
Source: Journal of Xiamen University (Natural Science), 2019, Issue 3, pp. 442-448 (7 pages); indexed in CAS, CSCD, and the Peking University Core Journal list
Funding: National Natural Science Foundation of China (60803078); Natural Science Foundation of Fujian Province (2010J01351); Scientific Research Foundation for Returned Overseas Chinese Scholars, Ministry of Education
Keywords: event detection; embeddings from language models (ELMo); long short-term memory network (LSTM); dynamic multi-pooling convolutional neural network (DMCNN); attention mechanism