
A Multi-Hop Attention Deep Model for Aspect-Level Sentiment Classification (Cited by: 3)
Abstract: Text sentiment classification has been a hot topic in natural language processing in recent years; it aims to analyze the subjective orientation expressed in text, and fine-grained sentiment classification with respect to specific aspects is receiving increasing attention. Adding an attention mechanism to traditional deep models can significantly improve classification performance. Targeting the characteristics of the Chinese language, a deep model combining a multi-hop attention mechanism with a convolutional neural network (MHA-CNN) is proposed. The model uses multi-dimensional combination features to remedy the deficiency of one-dimensional feature attention, and can capture deeper aspect-level sentiment feature information without any prior knowledge. Compared with an attention-based long short-term memory (LSTM) network, the model has a smaller training time overhead and retains the local word-order information of the features. Experiments on a publicly available Chinese dataset covering six domains show better classification results than an ordinary deep network model, an attention-based LSTM model, and an attention-based deep memory network model.
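The abstract describes the architecture only at a high level, so the PyTorch sketch below merely illustrates the general idea it names (multi-hop attention driven by an aspect query, followed by convolutions over the reweighted word sequence so that local word order is preserved); it is not the authors' implementation, and the hop count, filter sizes, mean-pooled aspect query, and all names such as MultiHopAttentionCNN are assumptions made for illustration.

```python
# Hypothetical sketch of a multi-hop-attention + CNN aspect-level classifier.
# All hyperparameters and the layer layout are assumptions, not the paper's.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHopAttentionCNN(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, num_hops=3,
                 num_filters=100, kernel_sizes=(2, 3, 4), num_classes=3):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # One attention scorer per hop; each hop refines the aspect query.
        self.hop_layers = nn.ModuleList(
            [nn.Linear(2 * embed_dim, 1) for _ in range(num_hops)])
        # Convolutions over the attention-reweighted sequence keep local
        # word-order information that pure attention pooling would discard.
        self.convs = nn.ModuleList(
            [nn.Conv1d(embed_dim, num_filters, k) for k in kernel_sizes])
        self.fc = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, context_ids, aspect_ids):
        ctx = self.embedding(context_ids)               # (B, L, D)
        query = self.embedding(aspect_ids).mean(dim=1)  # (B, D) aspect vector
        for hop in self.hop_layers:
            q = query.unsqueeze(1).expand_as(ctx)       # (B, L, D)
            scores = hop(torch.cat([ctx, q], dim=-1))   # (B, L, 1)
            alpha = F.softmax(scores, dim=1)
            # One "hop": update the aspect query with the attended context.
            query = query + (alpha * ctx).sum(dim=1)
            # Reweight the context words for the next hop / the CNN stage.
            weighted = alpha * ctx
        feats = [F.relu(conv(weighted.transpose(1, 2))) for conv in self.convs]
        pooled = [F.max_pool1d(f, f.size(2)).squeeze(2) for f in feats]
        return self.fc(torch.cat(pooled, dim=1))        # (B, num_classes)
```

In use, context_ids would be a batch of padded word-id sequences for the review sentences and aspect_ids the ids of the corresponding aspect terms; convolving over the attention-reweighted sequence, rather than pooling it directly, is what the abstract contrasts with attention-based LSTM and memory-network baselines.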
Authors: DENG Yu, LEI Hang, LI Xiao-yu, LIN Yi-ou (School of Information and Software Engineering, University of Electronic Science and Technology of China, Chengdu 610054)
Source: Journal of University of Electronic Science and Technology of China (《电子科技大学学报》), 2019, No. 5, pp. 759-766 (8 pages). Indexed in EI, CAS, CSCD, Peking University Core.
Funding: National Natural Science Foundation of China (61502082); Fundamental Research Funds for the Central Universities (ZYGX2014J065)
Keywords: aspect-based sentiment categorization; attention mechanism; convolutional neural network; deep learning; natural language processing


