
Chinese Named Entity Recognition Based on Multi-Attention
Abstract: Building on neural-network-based named entity recognition, we propose an improved method for Chinese named entity recognition. By adjusting part of the intermediate network architecture and introducing a Transformer encoder, the method learns the meaning of the text itself without adding any external information. Multi-head attention strengthens the feature representation of the text and captures more relationships between characters than a long short-term memory (LSTM) network, while also resolving the LSTM's inability to compute in parallel. Experiments on a Chinese enterprise public-opinion dataset show that, compared with traditional methods, the proposed approach effectively improves the accuracy of Chinese named entity recognition.
Author: Gu Lingyun (Shanghai IceKredit Information Technology Co., Ltd., Shanghai 200120, China)
Source: Information & Computer (《信息与电脑》), 2019, No. 9, pp. 41-44, 48 (5 pages)
Keywords: neural network; named entity recognition; attention
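
The abstract does not give the exact network configuration, so the following is only a minimal sketch of the core idea it describes: replacing the sequential LSTM feature extractor with a multi-head self-attention (Transformer) encoder over character embeddings, so that relationships between all characters are modeled in one parallel step. The model name, hyperparameters, tag count, and the absence of a CRF output layer are illustrative assumptions, written in PyTorch.

```python
# Sketch of a multi-attention Chinese NER encoder (not the authors' exact model).
# Character embeddings are fed through a Transformer encoder; each position's
# hidden state is then projected to per-character entity-tag scores.
import torch
import torch.nn as nn


class MultiAttentionNER(nn.Module):
    def __init__(self, vocab_size, num_tags, d_model=128, nhead=8, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model, padding_idx=0)
        # Learned position embeddings; sinusoidal encodings would also work.
        self.pos = nn.Embedding(512, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, dim_feedforward=4 * d_model,
            batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        # Per-character tag scores; a CRF layer could be stacked on top.
        self.classifier = nn.Linear(d_model, num_tags)

    def forward(self, char_ids, pad_mask):
        # char_ids: (batch, seq_len) character indices
        # pad_mask: (batch, seq_len), True where the position is padding
        positions = torch.arange(char_ids.size(1), device=char_ids.device)
        x = self.embed(char_ids) + self.pos(positions)
        # Multi-head self-attention lets every character attend to every other
        # character in parallel, unlike the sequential LSTM recurrence.
        h = self.encoder(x, src_key_padding_mask=pad_mask)
        return self.classifier(h)  # (batch, seq_len, num_tags)


# Illustrative usage with a toy batch (vocabulary size and BIO tag set assumed).
model = MultiAttentionNER(vocab_size=5000, num_tags=7)
char_ids = torch.randint(1, 5000, (2, 20))
pad_mask = torch.zeros(2, 20, dtype=torch.bool)
emissions = model(char_ids, pad_mask)
print(emissions.shape)  # torch.Size([2, 20, 7])
```

In this sketch every character attends to every other character in a single step, which is what removes the sequential dependency of the LSTM; the emission scores could be decoded greedily or passed to a CRF layer, as is common in NER models.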