
Convolutional Neural Networks Text Classification Model Based on Attention Mechanism

Cited by: 13
Abstract: Text classification is an important task in natural language processing, and effectively extracting the global semantics of a text is key to successful classification. To capture the non-local importance of the features extracted by a convolutional neural network, an Attention mechanism is introduced into the model and an A-CNN text classification model containing four Attention-CNN layers is built. Within each Attention-CNN layer, an ordinary convolutional layer extracts local features, while the Attention mechanism generates non-local correlation features. The A-CNN model is then evaluated and compared with other models on sentiment analysis, question classification, and question answer selection datasets. The results show that, relative to the comparison models, A-CNN improves the best accuracy on these three tasks by 1.9%, 4.3%, and 0.6%, respectively, indicating that the model achieves high accuracy and strong generality on text classification tasks.
Authors: ZHAO Yunshan; DUAN Youxiang (College of Computer & Communication Engineering, China University of Petroleum, Qingdao 266580, Shandong Province, China)
Source: Journal of Applied Sciences (《应用科学学报》), CAS / CSCD / Peking University Core Journal, 2019, Issue 4, pp. 541-550 (10 pages)
Funding: Supported by the National Science and Technology Major Project (No. 2017ZX05009001-09)
Keywords: text categorization; convolutional neural network (CNN); Attention mechanism; non-local correlation
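
Since the abstract describes the Attention-CNN layer only at a high level (an ordinary convolution for local features plus an Attention mechanism for non-local correlation features), the snippet below is a minimal, hypothetical PyTorch sketch of one such layer. All class and parameter names, hyperparameters, and the concatenation-based fusion of the two branches are illustrative assumptions, not the authors' published implementation.

```python
# Hypothetical sketch of one "Attention-CNN" layer as described in the abstract:
# a 1-D convolution extracts local n-gram features, while a self-attention
# branch produces non-local correlation features; the two are then fused.
# Kernel size, feature dimensions, and the fusion scheme are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionCNNLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int, kernel_size: int = 3):
        super().__init__()
        # Local-feature branch: ordinary 1-D convolution over the token sequence.
        self.conv = nn.Conv1d(in_dim, out_dim, kernel_size, padding=kernel_size // 2)
        # Non-local branch: scaled dot-product self-attention over all positions.
        self.query = nn.Linear(in_dim, out_dim)
        self.key = nn.Linear(in_dim, out_dim)
        self.value = nn.Linear(in_dim, out_dim)
        self.proj = nn.Linear(2 * out_dim, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, in_dim)
        local = self.conv(x.transpose(1, 2)).transpose(1, 2)          # (B, L, out_dim)
        q, k, v = self.query(x), self.key(x), self.value(x)
        scores = torch.matmul(q, k.transpose(1, 2)) / (q.size(-1) ** 0.5)
        non_local = torch.matmul(F.softmax(scores, dim=-1), v)        # (B, L, out_dim)
        # Fuse local and non-local features (concatenation is an assumption).
        return self.proj(torch.cat([local, non_local], dim=-1))
```

According to the abstract, the full A-CNN model stacks four such Attention-CNN layers before the final classifier; how the layers are connected and pooled is not specified here.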