
Polarity Shifting and LSTM Based Recursive Networks for Sentiment Analysis (cited by: 90)
Abstract: Long short-term memory (LSTM) is an effective chain-structured recurrent neural network (RNN), widely used in language modeling, machine translation, speech recognition, and related areas. Because its chain structure cannot adequately represent the hierarchical structure of language, this paper extends LSTM to a tree-structured recursive neural network (Recursive Neural Network, RNN) to capture deeper syntactic and semantic information, and introduces a sentiment polarity shifting model based on the associations between adjacent words in a sentence. Experiments show that the proposed model outperforms LSTM, recursive neural networks, and other baselines.
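The abstract describes extending the chain LSTM to operate over a parse tree, so that a node's memory cell is composed from its children rather than from a single predecessor. The paper does not give its equations on this page, so the sketch below is a minimal child-sum tree-structured LSTM cell in the commonly used formulation (one forget gate per child); the class name, dimensions, and initialization are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ChildSumTreeLSTMCell:
    """Minimal child-sum tree-LSTM cell (illustrative sketch, not the paper's code)."""

    def __init__(self, in_dim, mem_dim, seed=0):
        rng = np.random.default_rng(seed)
        W = lambda: rng.normal(0.0, 0.1, (mem_dim, in_dim))   # input-to-gate weights
        U = lambda: rng.normal(0.0, 0.1, (mem_dim, mem_dim))  # hidden-to-gate weights
        self.Wi, self.Ui, self.bi = W(), U(), np.zeros(mem_dim)  # input gate
        self.Wf, self.Uf, self.bf = W(), U(), np.zeros(mem_dim)  # forget gate (per child)
        self.Wo, self.Uo, self.bo = W(), U(), np.zeros(mem_dim)  # output gate
        self.Wu, self.Uu, self.bu = W(), U(), np.zeros(mem_dim)  # candidate update

    def forward(self, x, children):
        """x: node input vector; children: list of (h, c) pairs (empty at leaves)."""
        h_sum = sum((h for h, _ in children), np.zeros_like(self.bi))
        i = sigmoid(self.Wi @ x + self.Ui @ h_sum + self.bi)
        o = sigmoid(self.Wo @ x + self.Uo @ h_sum + self.bo)
        u = np.tanh(self.Wu @ x + self.Uu @ h_sum + self.bu)
        c = i * u
        # each child gets its own forget gate, conditioned on that child's hidden state
        for h_k, c_k in children:
            f_k = sigmoid(self.Wf @ x + self.Uf @ h_k + self.bf)
            c = c + f_k * c_k
        h = o * np.tanh(c)
        return h, c

# usage: compose a tiny tree with two leaves and one root
cell = ChildSumTreeLSTMCell(in_dim=4, mem_dim=5)
h1, c1 = cell.forward(np.ones(4), [])                 # leaf
h2, c2 = cell.forward(np.zeros(4), [])                # leaf
h_root, c_root = cell.forward(0.5 * np.ones(4), [(h1, c1), (h2, c2)])
```

Because the composition runs bottom-up over the tree, the root hidden state `h_root` summarizes the whole sentence and can feed a sentiment classifier, which is the role the tree-structured network plays in the paper.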
Source: Journal of Chinese Information Processing (《中文信息学报》, CSCD, Peking University Core), 2015, No. 5, pp. 152-159 (8 pages).
Funding: National Social Science Foundation of China (14BYY096); National Natural Science Foundation of China (61402419, 61272221); National High-Tech R&D Program of China (863 Program, 2012AA011101); Open Project of the Key Laboratory of Computational Linguistics (Peking University), Ministry of Education (201401); National Basic Research Program of China (973 Program, 2014CB340504); Key Research Project of Higher Education Institutions of Henan Province (15A520098).
Keywords: LSTM; recursive neural network; sentiment analysis
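The polarity shifting the abstract refers to is the phenomenon where context (most visibly negation) inverts a word's sentiment. The paper models this neurally from inter-word associations; that model is not reproduced on this page, so the following is only a toy rule-based sketch of the phenomenon itself, with a hypothetical negator set and sentiment lexicon, to make the term concrete.

```python
# Hypothetical toy lexicons for illustration only; the paper learns
# polarity shifting inside the network rather than from rules.
NEGATORS = {"not", "never", "no", "hardly"}
LEXICON = {"good": 1.0, "great": 1.0, "bad": -1.0, "terrible": -1.0}

def sentence_polarity(tokens):
    """Score a token list; a negator flips the polarity of the next sentiment word."""
    score, flip = 0.0, 1.0
    for tok in tokens:
        if tok in NEGATORS:
            flip = -1.0                 # pending shift for the next sentiment word
        elif tok in LEXICON:
            score += flip * LEXICON[tok]
            flip = 1.0                  # shift is consumed once applied
    return score
```

For example, `sentence_polarity(["not", "good"])` yields a negative score even though "good" alone is positive, which is exactly the effect a sentiment model must capture.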



Citation statistics: 90 citing articles; 746 second-level citing articles; 622 co-citing articles; 591 co-cited references.
