
Text Sentiment Classification Based on Multi-Feature LSTM-Self-Attention (cited by: 1)
Abstract: To address the problem of sentiment classification in the field of sentiment analysis in natural language processing, a text sentiment classification method based on multi-feature LSTM-Self-Attention is proposed. The method takes word vectors and part-of-speech vectors as input, uses an LSTM network to extract sequence features from the text, and introduces a self-attention mechanism into the model to extract syntactic and semantic features of sentences from those sequence features, reducing task complexity. The approach avoids the vanishing- and exploding-gradient problems of traditional recurrent neural networks and greatly shortens the distance between long-range dependent word features, improving classification performance. Experiments on a Chinese movie-review dataset show that the method has stronger feature-extraction ability and raises sentiment classification accuracy by 1.74%.
Authors: XIE Bin-hong (谢斌红), DONG Yue-run (董悦闰), PAN Li-hu (潘理虎), ZHANG Ying-jun (张英俊) (Department of Computer Science and Technology, Taiyuan University of Science and Technology, Taiyuan, Shanxi 030024, China; Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences, Beijing 100101, China)
Source: Computer Simulation (《计算机仿真》, Peking University Core Journal), 2021, No. 11, pp. 479-484, 489 (7 pages)
Keywords: sentiment classification; long short-term memory (LSTM) network; self-attention mechanism; part-of-speech vector
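
To make the pipeline in the abstract concrete, the following is a minimal sketch of a multi-feature LSTM with self-attention in PyTorch. The layer sizes, vocabulary sizes, the use of a bidirectional LSTM, the scaled dot-product form of the attention, and the mean-pooling step are illustrative assumptions, not the authors' exact configuration.

    # Minimal sketch of the multi-feature LSTM + self-attention classifier
    # described in the abstract. All sizes and the attention formulation
    # are illustrative assumptions, not the authors' exact configuration.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class LSTMSelfAttention(nn.Module):
        def __init__(self, vocab_size=10000, pos_vocab_size=50,
                     word_dim=300, pos_dim=50, hidden_dim=128, num_classes=2):
            super().__init__()
            # Two input features: word embeddings and part-of-speech embeddings.
            self.word_emb = nn.Embedding(vocab_size, word_dim)
            self.pos_emb = nn.Embedding(pos_vocab_size, pos_dim)
            # A (here bidirectional) LSTM extracts sequence features from the
            # concatenated word and POS vectors.
            self.lstm = nn.LSTM(word_dim + pos_dim, hidden_dim,
                                batch_first=True, bidirectional=True)
            # Linear maps for a scaled dot-product self-attention layer.
            self.query = nn.Linear(2 * hidden_dim, 2 * hidden_dim)
            self.key = nn.Linear(2 * hidden_dim, 2 * hidden_dim)
            self.value = nn.Linear(2 * hidden_dim, 2 * hidden_dim)
            self.fc = nn.Linear(2 * hidden_dim, num_classes)

        def forward(self, words, pos_tags):
            # words, pos_tags: (batch, seq_len) integer index tensors
            x = torch.cat([self.word_emb(words), self.pos_emb(pos_tags)], dim=-1)
            h, _ = self.lstm(x)                      # (batch, seq_len, 2*hidden_dim)
            q, k, v = self.query(h), self.key(h), self.value(h)
            scores = q @ k.transpose(1, 2) / (k.size(-1) ** 0.5)
            attn = F.softmax(scores, dim=-1)         # each position attends to all others
            context = attn @ v                       # (batch, seq_len, 2*hidden_dim)
            sentence = context.mean(dim=1)           # pool into one sentence vector
            return self.fc(sentence)                 # class logits

    model = LSTMSelfAttention()
    logits = model(torch.randint(0, 10000, (4, 20)),  # batch of 4 sentences, length 20
                   torch.randint(0, 50, (4, 20)))     # matching POS-tag indices
    # logits.shape == (4, 2)

Because every position attends directly to every other position, the path between two long-range dependent words is a single attention step rather than a chain of recurrent updates, which is the mechanism the abstract credits for shortening long-distance dependencies.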


