
用户评论方面级情感分析研究 (Cited by: 12)

Research on Aspect-Level Sentiment Analysis of User Reviews
Abstract: Aspect-based sentiment analysis has become one of the most active research directions in natural language processing. Compared with traditional sentiment analysis techniques, it is fine-grained: it is aimed at specific targets in a sentence, can judge the sentiment tendency of multiple targets within one sentence, and can more accurately mine the sentiment polarity toward each target. To address the fact that previous research neglected separate modeling of the targets, an interactive attention network model based on bidirectional long short-term memory (Bi-IAN) is proposed. The model uses a bidirectional long short-term memory network (BiLSTM) to model the targets and the context separately, obtaining their hidden representations and extracting the semantic information they contain. An interactive attention module then learns the attention between the context and the targets, generates the representations of the target and of the context, captures the relevance within and between the target and the context, and reconstructs the representations of the evaluated object and the context; the classification result is finally obtained through a non-linear layer. Experiments on the SemEval 2014 Task 4 dataset and on Chinese review datasets show that the proposed model outperforms existing baseline sentiment analysis models in terms of accuracy and F1-score.
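To make the pipeline described in the abstract concrete, the following is a minimal sketch of a Bi-IAN-style model: two BiLSTM encoders for the context and the target, an interactive attention step in which each side is attended using the average-pooled summary of the other, and a non-linear layer over the concatenated representations. PyTorch is an assumption here, and all layer names, dimensions, and the additive attention scoring are illustrative choices, not the authors' released implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class BiIAN(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=128, num_classes=3):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Separate BiLSTM encoders for the context sentence and the target phrase.
        self.context_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.target_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        # Additive attention scorers for the interactive attention in both directions.
        self.ctx_attn = nn.Linear(4 * hidden_dim, 1)
        self.tgt_attn = nn.Linear(4 * hidden_dim, 1)
        # Non-linear classification layer over the concatenated representations.
        self.classifier = nn.Linear(4 * hidden_dim, num_classes)

    def forward(self, context_ids, target_ids):
        ctx, _ = self.context_lstm(self.embedding(context_ids))  # (B, Lc, 2H)
        tgt, _ = self.target_lstm(self.embedding(target_ids))    # (B, Lt, 2H)
        ctx_avg = ctx.mean(dim=1)  # coarse summary of the context
        tgt_avg = tgt.mean(dim=1)  # coarse summary of the target
        # Context words are scored against the target summary, and vice versa.
        ctx_scores = self.ctx_attn(torch.tanh(
            torch.cat([ctx, tgt_avg.unsqueeze(1).expand_as(ctx)], dim=-1))).squeeze(-1)
        tgt_scores = self.tgt_attn(torch.tanh(
            torch.cat([tgt, ctx_avg.unsqueeze(1).expand_as(tgt)], dim=-1))).squeeze(-1)
        # Attention-weighted representations of context and target.
        ctx_repr = (F.softmax(ctx_scores, dim=1).unsqueeze(-1) * ctx).sum(dim=1)
        tgt_repr = (F.softmax(tgt_scores, dim=1).unsqueeze(-1) * tgt).sum(dim=1)
        final = torch.tanh(torch.cat([ctx_repr, tgt_repr], dim=-1))
        return self.classifier(final)  # logits over sentiment classes

# Usage with random token ids: batch of 2, context length 10, target length 3.
model = BiIAN(vocab_size=5000)
logits = model(torch.randint(1, 5000, (2, 10)), torch.randint(1, 5000, (2, 3)))
print(logits.shape)  # torch.Size([2, 3])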
Authors: 陈虹 CHEN Hong; 杨燕 YANG Yan; 杜圣东 DU Shengdong (School of Information Science and Technology, Southwest Jiaotong University, Chengdu 611756, China)
Source: Journal of Frontiers of Computer Science and Technology (《计算机科学与探索》, CSCD, Peking University Core Journal), 2021, No. 3, pp. 478-485 (8 pages)
Funding: National Natural Science Foundation of China (61976247); National Key Technology R&D Program of China (2015BAH19F02).
Keywords: aspect-level sentiment analysis; deep learning; recurrent neural network (RNN); attention mechanism
