
Text Classification Based on Phrase Attention Mechanism

Cited by: 11
Abstract: Bidirectional recurrent neural networks with word-level attention suffer from the following problem in text classification: weighting words directly to generate the text representation discards a great deal of information, which makes the network hard to train on small datasets. Moreover, a word acquires a definite meaning only when it is combined with its context into a phrase, and the semantics of a text is often determined by a few key phrases, so a text representation composed by learning phrase weights can be more accurate than one composed by learning word weights. This paper therefore proposes NN-PA, a neural network framework based on a phrase-level attention mechanism: a convolutional layer is added after the word embedding layer to extract representations of N-gram phrases, and a bidirectional recurrent neural network with attention then learns the text representation from them. Five kinds of attention mechanisms are tested. Experiments show that the NN-PA models based on the different attention mechanisms not only clearly improve classification accuracy on both large and small datasets but also converge faster. In particular, NN-PA1 and NN-PA2 outperform mainstream deep learning models, and NN-PA2 achieves 53.35% accuracy on the five-class task of the Stanford Sentiment Treebank, which, to the best of the authors' knowledge, is the highest result reported to date.
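The abstract fixes only the overall pipeline (word embeddings, then a convolutional layer over N-gram windows to form phrase representations, then a bidirectional RNN whose outputs are pooled by attention into a text representation). It does not give the exact configuration or enumerate the five attention mechanisms, so the following is a minimal PyTorch sketch under stated assumptions: the GRU cell, the additive (Bahdanau-style) attention scoring, and all dimensions are illustrative choices, not the paper's settings.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NNPA(nn.Module):
    """Minimal sketch of the NN-PA pipeline described in the abstract.

    Assumptions (not specified by the paper): GRU cells, additive
    attention as the scoring function, ReLU after the convolution,
    and the hyperparameter values below.
    """
    def __init__(self, vocab_size, embed_dim=300, phrase_dim=256,
                 hidden_dim=128, ngram=3, num_classes=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Conv1d over the word sequence: each output position is the
        # representation of one N-gram phrase window.
        self.conv = nn.Conv1d(embed_dim, phrase_dim, kernel_size=ngram,
                              padding=ngram // 2)
        self.birnn = nn.GRU(phrase_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        # Additive attention: score each phrase-level hidden state,
        # then pool the states into a single text representation.
        self.att_proj = nn.Linear(2 * hidden_dim, 2 * hidden_dim)
        self.att_vec = nn.Linear(2 * hidden_dim, 1, bias=False)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, tokens):                      # tokens: (B, T)
        x = self.embed(tokens)                      # (B, T, E)
        x = F.relu(self.conv(x.transpose(1, 2)))    # (B, P, T)
        x = x.transpose(1, 2)                       # (B, T, P) phrase reps
        h, _ = self.birnn(x)                        # (B, T, 2H)
        scores = self.att_vec(torch.tanh(self.att_proj(h)))  # (B, T, 1)
        alpha = torch.softmax(scores, dim=1)        # phrase weights
        text_repr = (alpha * h).sum(dim=1)          # (B, 2H)
        return self.classifier(text_repr)           # (B, num_classes)

# Usage on random token ids (batch of 8 texts, 40 tokens each):
model = NNPA(vocab_size=20000)
logits = model(torch.randint(1, 20000, (8, 40)))
```

Note that attention here weights phrase-level states rather than word embeddings, which is the point of the design: the convolution merges each word with its context before any weighting takes place.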
Authors: JIANG Wei (江伟); JIN Zhong (金忠) (School of Computer Science and Engineering, Nanjing University of Science and Technology, Nanjing, Jiangsu 210094, China; MOE Key Laboratory of Intelligent Perception and System for High-Dimensional Information, Nanjing University of Science and Technology, Nanjing, Jiangsu 210094, China)
Source: Journal of Chinese Information Processing (《中文信息学报》), CSCD, Peking University Core Journal, 2018, No. 2, pp. 102-109, 119 (9 pages)
Funding: National Natural Science Foundation of China (61373063, 61375007, 61233011, 91420201, 61472187); National Key Basic Research Program of China (973 Program) (2014CB349303)
Keywords: text classification; recurrent neural network; convolutional layer; attention mechanism