
Aspect-Level Sentiment Analysis Model Incorporating Multi-layer Attention
(融合多层注意力的方面级情感分析模型)
Cited by: 11
Abstract: Aspect-level sentiment analysis aims to determine the sentiment polarity of a specific aspect in a given text. To address the insufficient use of aspect-oriented sentiment attention in current methods, this paper proposes an aspect-level sentiment classification model that fuses BERT with multi-layer attention (BERT and Multi-Layer Attention, BMLA). The model first extracts multi-layer aspect-sentiment attention information from inside BERT and builds multi-layer aspect attention by fusing the encoded aspect information with BERT's hidden-layer representation vectors; it then cascades this multi-layer aspect attention with the encoded output text, thereby strengthening the long-range dependencies between the sentence and the aspect words. Experiments on the SemEval 2014 Task 4 and AI Challenger 2018 datasets show that reinforcing the weight of the target aspect and letting it interact with the context is effective for aspect-level sentiment classification.
Authors: YUAN Xun, LIU Rong, LIU Ming (College of Physical Science and Technology, Central China Normal University, Wuhan 430079, China; School of Computer, Central China Normal University, Wuhan 430079, China)
Source: Computer Engineering and Applications (《计算机工程与应用》, CSCD, Peking University Core Journal), 2021, No. 22, pp. 147-152 (6 pages)
Funding: National Social Science Fund of China (19BTQ005).
Keywords: natural language processing; aspect sentiment analysis; BERT; multi-layer attention; dependency
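
The abstract describes three steps: drawing aspect-sentiment attention from BERT's internal layers, fusing the encoded aspect with those hidden-layer representations, and cascading the result with the encoded sentence output. The following is a minimal PyTorch/Hugging Face sketch of that idea, not the authors' released implementation; the choice of the last four encoder layers, the single-[CLS] aspect query, the 8-head attention, and the simple concatenation classifier are all illustrative assumptions.

# Minimal sketch of a BMLA-style classifier (assumption-based reconstruction):
# aspect-aware attention is computed over several internal BERT layers, and the
# attended summaries are cascaded with the final [CLS] representation.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class BMLASketch(nn.Module):
    def __init__(self, bert_name="bert-base-uncased",
                 layers=(9, 10, 11, 12), num_labels=3):
        super().__init__()
        # expose all hidden layers so internal representations can be reused
        self.bert = BertModel.from_pretrained(bert_name, output_hidden_states=True)
        hidden = self.bert.config.hidden_size
        self.layers = layers  # which internal layers to attend over (assumed choice)
        self.aspect_attn = nn.ModuleList(
            [nn.MultiheadAttention(hidden, num_heads=8, batch_first=True)
             for _ in layers]
        )
        # classifier over [final CLS ; one aspect-attended summary per layer]
        self.classifier = nn.Linear(hidden * (len(layers) + 1), num_labels)

    def forward(self, input_ids, attention_mask, aspect_ids, aspect_mask):
        sent = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        asp = self.bert(input_ids=aspect_ids, attention_mask=aspect_mask)
        aspect_query = asp.last_hidden_state[:, :1, :]    # aspect [CLS] as query
        feats = [sent.last_hidden_state[:, 0, :]]         # final-layer sentence [CLS]
        for attn, layer in zip(self.aspect_attn, self.layers):
            hidden_l = sent.hidden_states[layer]          # internal BERT layer
            ctx, _ = attn(aspect_query, hidden_l, hidden_l,
                          key_padding_mask=attention_mask.eq(0))
            feats.append(ctx.squeeze(1))                  # aspect-attended layer summary
        return self.classifier(torch.cat(feats, dim=-1))  # cascade and classify


# usage example on a single sentence/aspect pair
tok = BertTokenizer.from_pretrained("bert-base-uncased")
sent = tok("The food was great but the service was slow.", return_tensors="pt")
aspect = tok("service", return_tensors="pt")
logits = BMLASketch()(sent.input_ids, sent.attention_mask,
                      aspect.input_ids, aspect.attention_mask)

Under these assumptions, each selected layer contributes one aspect-conditioned summary vector, so the interaction between the target aspect and the context is weighted at several depths of the encoder rather than only at the final layer.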


