
Text sentiment analysis during the epidemic based on a TCN and BiLSTM+Attention fusion model (Cited: 8)
Abstract: Mainstream text sentiment analysis methods struggle to capture long-term dependencies and make insufficient use of contextual information. To address these shortcomings, this paper proposes, for the first time, a text sentiment analysis model that fuses a Temporal Convolutional Network (TCN) with a BiLSTM+Attention model. The model uses the TCN's causal and dilated convolution structures to extract higher-level text sequence features, then further learns context-dependent sentiment features with a Bidirectional Long Short-Term Memory network (BiLSTM). Finally, a self-attention mechanism is introduced to help the model refine the feature vectors and improve sentiment classification accuracy. Comparative experiments on a dataset of Weibo posts collected during the novel coronavirus epidemic show that the proposed model clearly outperforms the other models evaluated.
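To make the pipeline described in the abstract concrete, the following is a minimal PyTorch sketch of the fusion it outlines: a stack of causal, dilated convolutions (the TCN stage), a BiLSTM, self-attention over the BiLSTM outputs, and a classification head. All layer sizes, the number of TCN levels, the single-head attention, and the mean-pooling step are illustrative assumptions, not values taken from the paper, whose exact hyperparameters and wiring may differ.

import torch
import torch.nn as nn
import torch.nn.functional as F


class CausalConv1d(nn.Module):
    """1-D causal convolution: left-pads so a position never sees the future."""
    def __init__(self, in_ch, out_ch, kernel_size, dilation):
        super().__init__()
        self.left_pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)

    def forward(self, x):                         # x: (batch, channels, time)
        return self.conv(F.pad(x, (self.left_pad, 0)))


class TCNBiLSTMAttention(nn.Module):
    # All sizes below are illustrative assumptions, not the paper's settings.
    def __init__(self, vocab_size, embed_dim=128, tcn_channels=128,
                 lstm_hidden=64, num_classes=2, kernel_size=3, levels=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # TCN stage: stacked causal convolutions whose dilation doubles per
        # level (1, 2, 4, ...), widening the receptive field over the text.
        blocks, in_ch = [], embed_dim
        for i in range(levels):
            blocks += [CausalConv1d(in_ch, tcn_channels, kernel_size, 2 ** i),
                       nn.ReLU()]
            in_ch = tcn_channels
        self.tcn = nn.Sequential(*blocks)
        # BiLSTM stage: learns context-dependent features in both directions.
        self.bilstm = nn.LSTM(tcn_channels, lstm_hidden,
                              batch_first=True, bidirectional=True)
        # Self-attention stage: re-weights the BiLSTM outputs.
        self.self_attn = nn.MultiheadAttention(2 * lstm_hidden, num_heads=1,
                                               batch_first=True)
        self.fc = nn.Linear(2 * lstm_hidden, num_classes)

    def forward(self, tokens):                    # tokens: (batch, time)
        x = self.embed(tokens).transpose(1, 2)    # -> (batch, embed, time)
        x = self.tcn(x).transpose(1, 2)           # -> (batch, time, channels)
        h, _ = self.bilstm(x)                     # -> (batch, time, 2*hidden)
        a, _ = self.self_attn(h, h, h)            # self-attention over h
        pooled = a.mean(dim=1)                    # mean-pool (an assumption)
        return self.fc(pooled)                    # sentiment class logits


model = TCNBiLSTMAttention(vocab_size=30000)
logits = model(torch.randint(0, 30000, (4, 50)))  # 4 posts, 50 tokens each
print(logits.shape)                               # torch.Size([4, 2])

The causal left-padding keeps each convolution's output aligned with its input length while only looking backward in time, which is what lets the dilated stack model long-range dependencies without leaking future tokens.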
Authors: GUI Xiangquan, GAO Zhen, LI Li (School of Computer and Communication, Lanzhou University of Technology, Lanzhou 730050, China)
Source: Journal of Xi'an University of Technology, 2021, Issue 1, pp. 113-121 (9 pages); indexed in CAS and the Peking University Core Journals list
Funding: National Natural Science Foundation of China (61862040)
Keywords: text sentiment analysis; temporal convolutional network (TCN); bidirectional long short-term memory network (BiLSTM); self-attention mechanism; Weibo text during the epidemic

