
Research on Sentiment Analysis Based on Word Vector and CNN-BIGRU

Cited by: 3
Abstract: The two most important models in traditional sentiment analysis are the convolutional neural network (CNN) and the recurrent neural network (RNN); however, a CNN can only extract local information from text, and an RNN is prone to the exploding-gradient problem. To address these problems, a method combining a CNN with a two-layer bidirectional gated recurrent unit (BIGRU) is proposed. It joins the CNN's strength at extracting local features with the two-layer BIGRU's ability to capture contextual semantics and reinforce feature information. In addition, sentiment analysis texts often contain linguistic irregularities, which reduce the accuracy of text feature extraction. To address this, an attention mechanism is introduced into the original word-vector computation model so that it focuses on the important information in the text. Experimental results show that the proposed model's accuracy is 1.21% higher than the same model without the attention mechanism, 1.58% higher than the CNN-BILSTM model, and 1.39% higher than the CNN-BIGRU model.
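To make the architecture described in the abstract concrete, below is a minimal PyTorch sketch of a CNN feeding a two-layer BIGRU with an attention layer and a linear classifier. It is not the authors' implementation: all hyperparameters (embedding size, convolution channels, kernel width, hidden size) are illustrative assumptions, and for simplicity the attention here is applied over the BIGRU outputs rather than inside the word-vector computation as the paper proposes.

```python
# Minimal sketch of a CNN + two-layer BiGRU + attention sentiment classifier.
# All hyperparameters are illustrative assumptions, not the paper's settings.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CNNBiGRUAttention(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, conv_channels=128,
                 gru_hidden=128, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # CNN branch: captures local n-gram features; padding=1 with a
        # width-3 kernel preserves the sequence length for the GRU.
        self.conv = nn.Conv1d(embed_dim, conv_channels, kernel_size=3, padding=1)
        # Two stacked bidirectional GRU layers extract contextual semantics.
        self.bigru = nn.GRU(conv_channels, gru_hidden, num_layers=2,
                            bidirectional=True, batch_first=True)
        # Additive attention scores each time step so the classifier can
        # focus on the most informative words. NOTE: the paper puts
        # attention in the word-vector stage; this placement is a
        # simplification for illustration.
        self.attn = nn.Linear(2 * gru_hidden, 1)
        self.fc = nn.Linear(2 * gru_hidden, num_classes)

    def forward(self, token_ids):                      # (batch, seq_len)
        x = self.embedding(token_ids)                  # (batch, seq_len, embed_dim)
        x = F.relu(self.conv(x.transpose(1, 2)))       # (batch, channels, seq_len)
        x = x.transpose(1, 2)                          # (batch, seq_len, channels)
        h, _ = self.bigru(x)                           # (batch, seq_len, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1)   # (batch, seq_len, 1)
        context = (weights * h).sum(dim=1)             # (batch, 2*hidden)
        return self.fc(context)                        # (batch, num_classes)

model = CNNBiGRUAttention(vocab_size=20000)
logits = model(torch.randint(1, 20000, (4, 50)))       # 4 sentences, 50 tokens each
print(logits.shape)                                    # torch.Size([4, 2])
```

Padding the convolution keeps the sequence length intact, so every token position still has a BIGRU state for the attention layer to weigh.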
Authors: WU Gui-zhen; WANG Fang; HUANG Shu-cheng (School of Computer Science, Jiangsu University of Science and Technology, Zhenjiang 212100, China)
Source: Software Guide (《软件导刊》), 2022, No. 8, pp. 27-32 (6 pages)
Funding: National Natural Science Foundation of China (61772244)
Keywords: convolutional neural network; two-layer bidirectional gated recurrent neural network; attention mechanism; word vector; sentiment analysis